Data Availability

Published By: IBM     Published Date: Jul 26, 2017
To compete in today’s fast-paced business climate, enterprises need accurate and frequent sales and customer reports to make real-time operational decisions about pricing, merchandising and inventory management. They also require greater agility to respond to business events as they happen, and more visibility into business activities so information and systems are optimized for peak efficiency and performance. By making use of data capture and business intelligence to integrate and apply data across the enterprise, organizations can capitalize on emerging opportunities and build a competitive advantage. The IBM® data replication portfolio is designed to address these issues through a highly flexible one-stop shop for high-volume, robust, secure information replication across heterogeneous data stores. The portfolio leverages real-time data replication to support high availability, database migration, application consolidation, dynamic warehousing, master data management (MDM), service
Tags : 
ibm, infosphere, data replication, security, data storage
    
IBM
Published By: NetApp     Published Date: Mar 05, 2018
To keep pace with an increasingly digital world, enterprises are transforming their data infrastructures using all flash storage. As a leading all flash storage provider, NetApp simplifies your infrastructure to improve economics, while accelerating performance and increasing availability to enhance your company’s competitiveness. NetApp future-proofs your IT investments, allowing you to grow with confidence. NetApp® all flash storage reduces your storage footprint, power, and cooling by up to 10x; doubles performance at half the latency of leading competitors; and lets you migrate confidently from your existing SAN with a pathway to the cloud. With NetApp all flash arrays, your business is prepared to take on anything and everything the future can throw at it: rapid growth, new technology, or a shift in the industry. Cut fear out of the equation. Be data ready to bring on the future.
Tags : 
netapp, database performance, flash storage, data management, cost challenges
    
NetApp
Published By: Oracle ODA     Published Date: Aug 15, 2016
Businesses understand more than ever that they depend on data for insight and competitive advantage. And when it comes to data, they have always wanted easy access and fast performance. But how is the situation different now? Today, organizations want those elements and more. They want IT to strip away the limitations of time with faster deployment of new databases and applications. They want IT to reduce the limitations of distance by giving remote and branch offices better and more reliable access. And in a global world where business never stops, they want IT to ensure data availability around the clock. If IT can deliver databases and applications faster, on a more automated and consistent basis, to more locations without having to commit onsite resources, IT will be free to focus on more strategic projects.
Tags : 
    
Oracle ODA
Published By: Oracle ODA     Published Date: Aug 15, 2016
Oracle added two new models to the Oracle Database Appliance family in addition to the existing high availability model. With an entry list price starting at a fourth of the cost of the prior generation Oracle Database Appliance hardware and flexible Oracle Database software licensing, these new models bring Oracle Engineered Systems to within reach of every organization. Read about how the Oracle Database Appliance X-6 series expands the reach of the database appliance family to support various workloads, deployment scenarios, and database editions. They are especially designed for customers requiring only single instance databases, but who desire the simplicity, optimization, and affordability of the Oracle Database Appliance. These new models are ideal for customers who seek to avoid the complexity, tuning requirements, and higher costs of “build-your-own” database solutions.
Tags : 
    
Oracle ODA
Published By: CA Technologies     Published Date: Jul 20, 2017
Mainframes continue to provide high business value by combining efficient transaction processing with high-volume access to critical enterprise data. Business organizations are linking mobile devices to mainframe processing and data to support digital applications and drive business transformation. In this rapidly growing scenario, providing an excellent end-user experience becomes critical for business success. This analyst announcement note covers how CA Technologies is addressing the need for high availability and fast response times by optimizing mainframe performance with new machine learning and analytics capabilities.
Tags : 
    
CA Technologies
Published By: Pure Storage     Published Date: Nov 02, 2017
The tremendous growth of unstructured data is creating huge opportunities for organizations, but it is also creating significant challenges for the storage infrastructure. Many application environments that could extract maximum value from unstructured data have been held back by the limitations of legacy storage systems. For at least the past several years, users have expressed a need for storage solutions that deliver extreme performance along with simple manageability, density, high availability, and cost efficiency.
Tags : 
high performance, tco, multi protocol, management simplicity, the blade, elasticity software, performance, capacity
    
Pure Storage
Published By: Carbonite     Published Date: Oct 12, 2017
Carbonite provides data protection solutions for businesses and the IT professionals who serve them. Our product suite provides a full complement of backup, disaster recovery, and high availability solutions for any size business in any location around the world, all supported by a state-of-the-art global infrastructure.
Tags : 
draas, data recovery, recovery capability, cloud support, infrastructure
    
Carbonite
Published By: Datastax     Published Date: Dec 27, 2018
Today’s data volume, variety, and velocity have made relational databases nearly obsolete for handling certain types of workloads. But they have also put incredible strain on conventional NoSQL databases. The key is to find one that can deliver the infinite scale and high availability required to support high-volume, web-scale applications in clustered environments. This white paper details the capabilities and use cases of an Active Everywhere database.
Tags : 
    
Datastax
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Feb 18, 2019
The largest national multiline insurer had built a repository of insurance policies (P&C and life insurance) on microfilm and microfiche in the early ’90s as a preservation strategy. As this technology became outdated, they were grappling with several issues:
• Risk of losing their only source of data for insurance policies and corresponding communications, plus the need to improve data availability and the speed of claims evaluation
• Compliance issues, requiring WORM (write once read many) storage compliant with FINRA regulations, with data encrypted at rest
• The total cost of digitization was not encouraging when weighed against the 10-12 years of support left for maintaining the insurance policies
• The need for a low-cost, cloud-based, FINRA-compliant document management solution that could provide quick access to stored data
Download the full case study to learn how LTI’s e-Office solution reduced TCO by 50% for this insurer.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: IBM     Published Date: Jul 05, 2016
This white paper discusses the concept of shared data scale-out clusters, as well as how they deliver continuous availability and why they are important for delivering scalable transaction processing support.
Tags : 
ibm, always on business, cloud, big data, oltp, ibm db2 purescale, networking, knowledge management, enterprise applications, data management, business technology, data center
    
IBM
Published By: CDW     Published Date: Nov 12, 2012
In this article, you'll find new power-saving and measurement technologies, along with maturing best practices that can help IT managers implement comprehensive strategies to better rein in energy costs.
Tags : 
data center, power and cooling, availability, data center optimization, business technology
    
CDW
Published By: NetApp     Published Date: Sep 24, 2013
"Storage system architectures are moving away from monolithic scale-up approaches and adopting scale-out storage – providing a powerful and flexible way to respond to the inevitable data growth and data management challenges in today’s environments. With extensive data growth demands, there needs to be an increase in the levels of storage and application availability, performance, and scalability. Access this technical report that provides an overview of NetApp clustered Data ONTAP 8.2 and shows how it incorporates industry-leading unified architecture, non-disruptive operations, proven storage efficiency, and seamless scalability."
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution, non-disruptive operations, storage, business technology, data center
    
NetApp
Published By: NetApp     Published Date: Sep 24, 2013
"Today, IT’s customers are more mobile and global than ever before and as such expect their applications and data to be available 24x7. Interruptions, whether planned or unplanned, can have a major impact to the bottom line of the business. ESG Lab tested the ability of clustered Data ONTAP to provide continuous application availability and evaluated performance for both SAN and NAS configurations while running an Oracle OLTP workload. Check out this report to see the results."
Tags : 
mobile, global, applications, cloud, configuration, technology, knowledge management, storage, data center
    
NetApp
Published By: Nimble Storage     Published Date: Feb 26, 2016
Download this eBook to learn the steps you can take now to prepare for the all-flash data center.
Tags : 
flash storage, ssd, all flash data centers, nimble storage, predictive flash platform, application performance, data velocity, data protection, high availability, big data, predictive analytics, data center
    
Nimble Storage
Published By: Dell EMC     Published Date: Aug 03, 2015
XtremIO reduces datacenter footprint and complexity with in-line data reduction capabilities that address storage sprawl for Exchange databases. Thin provisioning doesn’t just eliminate unused space at the end of the drive; it eliminates all of the whitespace found within the database.
Tags : 
data reduction, deployment approaches, emc, exchange server, database availability, compression, cloud infrastructure, datacenter footprint reduction
    
Dell EMC
Published By: Schneider Electric     Published Date: Feb 12, 2018
Internet use is trending towards bandwidth-intensive content and an increasing number of attached “things”. At the same time, mobile telecom networks and data networks are converging into a cloud computing architecture. To support needs today and tomorrow, computing power and storage are being placed at the network edge to lower data transport times and increase availability. Edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. This white paper explains the drivers of edge computing and explores the various types of edge computing available.
Tags : 
    
Schneider Electric
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to the “Day One” challenges of deploying, managing, and monitoring PostgreSQL. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that approach is both misleading and incorrect when evaluating the potential return on investment of a database technology migration. An effective monitoring and logging strategy is critical for maintaining the reliability, availability, and performance of database environments. The second section of this eBook provides a detailed analysis of all aspects of monitoring and logging PostgreSQL:
• Monitoring KPIs
• Metrics and stats
• Monitoring tools
• Passive monitoring versus active notifications
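As a rough, generic illustration of the kind of metrics such a monitoring strategy tracks (a minimal sketch, not taken from the eBook; the connection string and database names are placeholders), the snippet below polls PostgreSQL’s built-in pg_stat_database view for a few common KPIs such as connection counts, commit/rollback rates, deadlocks, and cache hit ratio.

```python
# Minimal sketch: poll PostgreSQL statistics views for common KPIs.
# Illustrative only; the DSN is a placeholder, not a value from the eBook.
import psycopg2

DSN = "dbname=appdb user=monitor host=localhost"  # placeholder

KPI_QUERY = """
SELECT datname,
       numbackends,                                              -- active connections
       xact_commit,
       xact_rollback,
       deadlocks,
       round(blks_hit * 100.0 / NULLIF(blks_hit + blks_read, 0), 2) AS cache_hit_pct
FROM pg_stat_database
WHERE datname NOT IN ('template0', 'template1');
"""

def collect_kpis():
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(KPI_QUERY)
            for db, conns, commits, rollbacks, deadlocks, hit_pct in cur.fetchall():
                print(f"{db}: {conns} connections, "
                      f"{commits} commits / {rollbacks} rollbacks, "
                      f"{deadlocks} deadlocks, cache hit {hit_pct}%")

if __name__ == "__main__":
    collect_kpis()
```

In practice such numbers are shipped to a time-series store and alerted on, which is the active-notification side the eBook contrasts with passive monitoring.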
Tags : 
    
Stratoscale
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to the “Day Two” challenges of accelerating large-scale PostgreSQL deployments. With the ongoing shift towards open-source database solutions, it’s no surprise that PostgreSQL is the fastest growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that approach is both misleading and incorrect when evaluating the potential return on investment of a database technology migration. After a PostgreSQL deployment is live, there are a variety of day-two scenarios that require planning and strategizing. The third section of this eBook provides a detailed analysis of all aspects of accelerating large-scale PostgreSQL deployments:
• Backups and availability: strategies, point-in-time recovery, availability and scalability
• Upgrades and DevOps: the PostgreSQL upgrade process, application upgrades, and CI/CD
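As a minimal, hedged sketch of the backup half of a point-in-time-recovery setup (generic PostgreSQL tooling, not content from the eBook; host, user, and paths are placeholders), the snippet below takes a physical base backup with pg_basebackup; combined with continuous WAL archiving, such a backup can later be restored to a chosen recovery target time.

```python
# Minimal sketch: take a physical base backup with pg_basebackup as the
# foundation of a point-in-time-recovery (PITR) strategy. Paths, host, and
# user are placeholders, not values from the Stratoscale eBook.
import subprocess
from datetime import datetime

BACKUP_ROOT = "/var/backups/pg"                   # placeholder destination
PG_HOST, PG_USER = "db.internal", "replicator"    # placeholder credentials

def take_base_backup() -> str:
    target = f"{BACKUP_ROOT}/base_{datetime.now():%Y%m%d_%H%M%S}"
    subprocess.run(
        [
            "pg_basebackup",
            "--host", PG_HOST,
            "--username", PG_USER,
            "--pgdata", target,            # directory the backup is written to
            "--wal-method=stream",         # include the WAL needed for a consistent copy
            "--checkpoint=fast",
            "--progress",
        ],
        check=True,
    )
    return target

if __name__ == "__main__":
    print("base backup written to", take_base_backup())
```

Restoring then means replaying archived WAL on top of this base backup up to the desired point in time, which is the scenario the eBook’s PITR discussion addresses.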
Tags : 
    
Stratoscale
Published By: IBM     Published Date: Sep 28, 2017
Here are the 6 reasons to change your database:
• Lower total cost of ownership
• Increased scalability and availability
• Flexibility for hybrid environments
• A platform for rapid reporting and analytics
• Support for new and emerging applications
• Greater simplicity
Download now to learn more!
Tags : 
scalability, hybrid environment, emerging applications, rapid reporting
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Arbor     Published Date: Feb 07, 2013
How to protect your data center’s availability.
Tags : 
arbor, firewalls, ddos protection, intrusion prevention systems, ips, business technology
    
Arbor
Published By: Citrix Systems     Published Date: Jul 28, 2015
At long last, datacenters have a more efficient high-availability deployment option for application delivery controllers. According to new research, that’s welcome news among IT leaders.
Tags : 
    
Citrix Systems
Published By: Scalebase     Published Date: Mar 08, 2013
Learn how to scale MySQL databases with ScaleBase. Cost-effectively scale to an effectively unlimited number of users, with no disruption to your existing infrastructure.
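ScaleBase’s routing layer is proprietary; purely as a generic illustration of the sharding idea referenced in the tags (not ScaleBase’s implementation; the DSNs are placeholders), the sketch below routes each user’s reads and writes to one of several MySQL shards by hashing the user ID.

```python
# Generic illustration of hash-based sharding for MySQL, NOT ScaleBase's
# implementation: a thin routing layer that picks a shard by hashing the key.
import hashlib

SHARD_DSNS = [                      # placeholder connection strings
    "mysql://app@shard0.internal/appdb",
    "mysql://app@shard1.internal/appdb",
    "mysql://app@shard2.internal/appdb",
]

def shard_for(user_id: str) -> str:
    """Map a user ID to one shard, stably, via a hash of the key."""
    digest = hashlib.sha1(user_id.encode("utf-8")).hexdigest()
    index = int(digest, 16) % len(SHARD_DSNS)
    return SHARD_DSNS[index]

# Reads and writes for a given user always land on the same shard:
print(shard_for("user-42"))
```

A production router must also handle re-sharding, cross-shard queries, and failover, which is the complexity a product layer such as ScaleBase is meant to hide.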
Tags : 
mysql, databases with scalebase, no existing infrastructure issues, use mysql with scalebase, cost effectively scale, shard, cluster, high availability, failover, mariadb, read/write, scalability, capacity planning, it management, data management, business technology, data center
    
Scalebase
Published By: NetApp     Published Date: Sep 22, 2014
The NetApp EF series of all-flash arrays is designed specifically for database-driven environments demanding maximum performance, reliability, and availability. This ESG Lab report documents the real-world performance, reliability, availability, and serviceability of NetApp EF-Series flash arrays in Oracle database environments. A combination of hands-on testing by ESG Lab and audited in-house performance testing executed by NetApp was used to create this report. In this report, you’ll learn how ESG validated the NetApp EF-550 flash array’s performance of over 400,000 IOPS with sub-millisecond latency, while maintaining six-nines (99.9999%) availability.
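For context on the six-nines figure, the short calculation below converts an availability percentage into the downtime it permits per year; this is standard availability arithmetic, and only the 99.9999% value comes from the report summary above.

```python
# Convert an availability percentage into the downtime budget it allows.
# Standard availability arithmetic; only the 99.9999% figure comes from the text.
MINUTES_PER_YEAR = 365.25 * 24 * 60

def downtime_per_year(availability_pct: float) -> float:
    """Return allowed downtime in minutes per year for a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.9, 99.999, 99.9999):
    print(f"{pct}% availability -> {downtime_per_year(pct) * 60:.1f} seconds/year")
```

At 99.9999% availability, the implied budget is roughly 32 seconds of downtime per year.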
Tags : 
flash arrays, performance-driven databases, enterprise storage, database environment, serviceability, real world performance
    
NetApp
Published By: Equinix     Published Date: Oct 27, 2014
Enterprises grapple with a host of challenges that are spurring the creation of hybrid clouds: collections of computing infrastructure spread across multiple data centers and multiple cloud providers. This new concept often provokes uncertainty, which must be addressed head-on. As more applications and computing resources move to the cloud, enterprises will become more dependent on cloud vendors, whether the issue is access, hosting, management, or any number of other services. Cloud consumers want to avoid vendor lock-in, that is, dependence on a single cloud provider. They want to know that they will have visibility into data and systems across multiple platforms and providers. They want to be able to move servers and storage around without a negative impact on application availability.
Tags : 
data center, enterprise, cloud, experience, hybrid, performance, strategy, interconnectivity, network, drive, evolution, landscape, server, mobile, technology, globalization, stem, hyperdigitization, consumer, networking
    
Equinix