define data

Results 1 - 25 of 240
Published By: MicroStrategy     Published Date: Aug 21, 2019
To survive and thrive in an era of accelerating digital disruption, organizations require accessible data, actionable insights, continuous innovation, and disruptive business models. It’s no longer enough to prioritize and implement analytics – leaders are being challenged to stop doing analytics just for analytics’ sake and focus on defined business outcomes. In addition, these leaders are being challenged to bring predictive capabilities and even prescriptive recommended actions into production at scale. As AI and accelerated growth and transformation become top of mind, many enterprises are realizing that their current segmented analytics approach isn’t built to last, and that real transformation will require proper end-to-end data management, data security, and a data processing platform company-wide. The year 2019 will be a turning point for many organizations that realize being data-driven doesn’t guarantee future success.
Tags : 
    
MicroStrategy
Published By: Nutanix, Inc.     Published Date: May 02, 2013
Enterprise data centers are straining to keep pace with dynamic business demands, as well as to incorporate advanced technologies and architectures that aim to improve infrastructure performance, scale and economics. Meeting these requirements, however, often requires a complete rethinking of how data centers are designed and managed. Fortunately, many enterprise IT architects and leading cloud providers have already demonstrated the viability and the benefits of a more modern, software-defined data center. This Nutanix white paper examines eight fundamental steps leading to a more efficient, manageable and scalable data center.
Tags : 
nutanix, data-centers, dynamic business demands, advanced technologies, it management, knowledge management, data management
    
Nutanix, Inc.
Published By: Cisco     Published Date: Jan 05, 2015
The data center has gone through many major evolutionary changes over the past several decades, and each change has been defined by major shifts in architectures. The industry moved from the mainframe era to client/server computing and then to Internet computing. In 2011, another major shift began: the shift to a virtual data center. This has been the primary driver in enabling customers to transition to the cloud and ultimately IT as a service. The shift to a virtual data center will be the single biggest transition in the history of computing. It will reshape all the major data center tiers: applications, storage, servers and the network.
Tags : 
data center, consolidation, server virtualization, new applications, it management, data management
    
Cisco
Published By: SAS     Published Date: Apr 25, 2017
But if you can’t explain how you got the answer, or what it means, it’s no good. Most self-service BI solutions can only display what has already happened, through reports or dashboards. And most have a predefined path of analysis that gives users very little creative freedom to explore new lines of thought. To maintain competitive advantage, your BI solution should allow business users to quickly and easily investigate and interrogate the data to find out why something happened – to uncover the root cause behind the “what.”
Tags : 
    
SAS
Published By: Cisco     Published Date: Mar 22, 2019
Cisco ACI, the industry-leading software-defined networking solution, facilitates application agility and data center automation. With ACI Anywhere, enable scalable multicloud networks with a consistent policy model, and gain the flexibility to move applications seamlessly to any location or any cloud while maintaining security and high availability.
Tags : 
    
Cisco
Published By: Corrigo     Published Date: Sep 12, 2019
We’re living in a new era defined by data, analytics, and intelligence. It can sound overwhelming, but once you understand how to make all this information work for you, it becomes really exciting. The Intelligence Economy has already changed our day-to-day lives, and it’s revolutionizing the world of facilities management. The best FMs don’t just understand this change – they embrace it. With a better handle on intelligence, you will see immediate results. From warranty enforcement to asset repair vs. replacement decisions, Corrigo’s collective intelligence can improve your bottom line and make you better at your job. Read on to learn how Corrigo makes the Intelligence Economy work for you.
Tags : 
    
Corrigo
Published By: VMware     Published Date: Jun 04, 2019
In an era where speed and performance are critical, moving to a software-centric approach in every area of the data center is the only way to get ahead in today's digital economy. A modern, software-defined infrastructure enables organizations to leverage prior investments, extend existing IT knowledge and minimize disruption along the way. VMware and Intel provide IT organizations a path to digital transformation, delivering consistent infrastructure and consistent operations across data centers and public clouds to accelerate application speed and agility for business innovation and growth.
Tags : 
    
VMware
Published By: SAP     Published Date: May 18, 2014
This TDWI Checklist Report presents requirements for analytic DBMSs with a focus on their use with big data. Along the way, the report also defines the many techniques and tool types involved. The requirements checklist and definitions can assist users who are currently evaluating analytic databases and/or developing strategies for big data analytics.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
    
SAP
Published By: Datastax     Published Date: Apr 04, 2017
As an Enterprise Architect (or an aspiring one), your job is to help define, build, and manage your company's technology architecture for its single most important asset - its information - in order to meet the company's business goals. Read this comprehensive guide to learn how all the key areas - database design, creation, security, object management, backup/recovery, monitoring and tuning, data migrations, and more - are carried out in a NoSQL database.
Tags : 
architect, guide, nosql, datastax
    
Datastax
Published By: Datastax     Published Date: Aug 07, 2018
As an Enterprise Architect (or an aspiring one), your job is to help define, build, and manage your company's technology architecture for its single most important asset - its information - in order to meet your company's business goals. Read this comprehensive guide to learn the ins and outs of designing data management architectures to manage mixed workloads at scale.
Tags : 
    
Datastax
Published By: Dome9     Published Date: Apr 25, 2018
AWS provides powerful controls to manage the security of software-defined infrastructure and cloud workloads, including virtual networks for segmentation, DDoS mitigation, data encryption, and identity and access control. Because AWS enables rapid and elastic scalability, the key to securing cloud environments is using security automation and orchestration to effectively implement consistent protection across your AWS environment. The following eBook will discuss Dome9 best practices for using AWS controls to establish a strict security posture that addresses your unique business needs, and maintaining consistency across regions, accounts, and Virtual Private Clouds (VPCs) as your environment grows.
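The automation idea described above can be sketched as a simple posture check. The sketch below flags security groups that allow SSH from anywhere; the rule structure loosely mirrors the shape of EC2 security-group data, but the group names and rules are invented for illustration, and no AWS calls are made.

```python
def open_ssh_ingress(security_groups):
    """Flag groups that allow SSH (port 22) ingress from anywhere (0.0.0.0/0)."""
    findings = []
    for sg in security_groups:
        for rule in sg["ingress"]:
            if rule["from_port"] <= 22 <= rule["to_port"] and "0.0.0.0/0" in rule["cidrs"]:
                findings.append(sg["id"])
    return findings

# Hypothetical inventory: a web group (HTTPS open, fine) and an admin group (SSH open, flagged).
groups = [
    {"id": "sg-web", "ingress": [{"from_port": 443, "to_port": 443, "cidrs": ["0.0.0.0/0"]}]},
    {"id": "sg-admin", "ingress": [{"from_port": 22, "to_port": 22, "cidrs": ["0.0.0.0/0"]}]},
]
print(open_ssh_ingress(groups))  # ['sg-admin']
```

In practice a check like this would be fed from security-group inventories collected across each region, account, and VPC, which is the consistency problem the eBook addresses.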
Tags : 
    
Dome9
Published By: VMTurbo     Published Date: Mar 25, 2015
An Intelligent Roadmap for Capacity Planning

Many organizations apply overly simplistic principles to determine requirements for compute capacity in their virtualized data centers. These principles are based on a resource allocation model, which takes the total amount of memory and CPU allocated to all virtual machines in a compute cluster and assumes a defined level of overprovisioning (e.g., 2:1, 4:1, 8:1, 12:1) in order to calculate the requirement for physical resources. Often managed in spreadsheets or simple databases, and augmented by simple alert-based monitoring tools, the resource allocation model does not account for actual resource consumption driven by each application workload running in the operational environment, and inherently erodes the level of efficiency that can be driven from the underlying infrastructure.
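The allocation model described above amounts to simple arithmetic, which is exactly the paper's criticism. A minimal sketch (the cluster totals and the 4:1 ratio below are invented for illustration):

```python
def physical_capacity_required(total_vcpus_allocated, total_mem_gb_allocated, overcommit_ratio):
    """Allocation-based sizing: divide allocated totals by an assumed overcommit ratio."""
    return (total_vcpus_allocated / overcommit_ratio,
            total_mem_gb_allocated / overcommit_ratio)

# Hypothetical cluster: 400 vCPUs and 1600 GB allocated across all VMs,
# sized with an assumed 4:1 overcommit ratio.
cpus, mem = physical_capacity_required(400, 1600, 4)
print(cpus, mem)  # 100.0 400.0
```

Note what the model ignores: nothing here measures what workloads actually consume, so a cluster of idle VMs and a cluster of saturated VMs size identically - the gap a workload-aware approach is meant to close.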
Tags : 
capacity planning, vmturbo, resource allocation model, cpu, cloud era, it management, knowledge management
    
VMTurbo
Published By: Cisco     Published Date: Jul 11, 2016
Cisco estimates that the Internet of Everything (IoE) — the networked connection of people, process, data, and things — will generate $19 trillion in Value at Stake for the private and public sectors combined between 2013 and 2022. More than 42 percent of this value — $8 trillion — will come from one of IoE’s chief enablers, the Internet of Things (IoT). Defined by Cisco as “the intelligent connectivity of physical devices, driving massive gains in efficiency, business growth, and quality of life,” IoT often represents the quickest path to IoE value for private and public sector organizations. This paper combines original and secondary research, as well as economic analysis, to provide a roadmap for maximizing value from IoT investments. It also explains why, in the worlds of IoT and IoE, the combination of edge computing/analytics and data center/cloud is essential to driving actionable insights that produce improved business outcomes.
Tags : 
    
Cisco
Published By: VMWare EMEA     Published Date: Oct 23, 2017
Organizations that automate their highly-virtualized infrastructure environments discover that they are able to introduce new forms of self-service and on-demand IT delivery models. As a result, they’re far better positioned to compete in a world where the quality and performance of applications increasingly dictates success and failure. On the journey to agility, the ultimate destination remains infrastructure defined by software and operating in a way that’s both application-aware and self-healing. For enterprises competing in the digital economy, deploying a CMP to manage increasingly complex infrastructure has become an indispensable part of that journey. Learn more about vRealize Suite and our strategy for the Software-Defined Data Center.
Tags : 
cloud optimization, cloud efficiency, cloud management, cloud assurance, cloud visibility, enterprise management, data management
    
VMWare EMEA
Published By: Juniper Networks     Published Date: Aug 08, 2017
Cloud, social, big data, and the Internet of Things (IoT) are increasingly central to business decisions as the pace of digitization accelerates. The impact of software-defined networking (SDN), virtualization, and converged and hyperconverged infrastructure within the datacenter is substantial. These technologies add complexity but offer enticing opportunities for new business models, revenue streams, operating efficiencies, and agility that organizations must pursue if they want to remain competitive and viable. This pursuit requires businesses to keep up with current and emerging technologies and applications and transform the ways in which they conduct business. At the core of "keeping up" is an organization's datacenter strategy — with an associated technology and services strategy that will either create industry laggards or accelerate innovators.
Tags : 
    
Juniper Networks
Published By: Microsoft Azure     Published Date: Apr 11, 2018
When you extend the global reach of your enterprise, you’ll find new markets for your products and services. That means reaching more potential customers, bigger growth potential, and higher ROI. But to tap into those emerging markets, you need to provide the best, most consistent user experience. Now, it’s possible for you to build, deploy, and manage modern apps at scale with a globally-distributed database—without the hassles associated with hosting in your data center. Read the e-book Build Modern Apps with Big Data at a Global Scale and learn how Azure Cosmos DB, a globally-distributed turnkey database service, is transforming the world of modern data management. Keep access to your data available, consistent, and safe—with industry-leading, enterprise-grade security and compliance. Start developing the best app experience for your users based on five well-defined consistency models: Strong: Favors data consistency. Ideal for banks, e-commerce processing, and online booking. Boun
Tags : 
    
Microsoft Azure
Published By: Red Hat     Published Date: Nov 30, 2015
A step-by-step guide for adopting software-defined storage as part of a practical, proactive strategy for managing enterprise data. Flexibility is inherent in this technology, and now more than ever, a flexible IT architecture might be the key to facing the once unthinkable challenges of big data.
Tags : 
software-defined storage, enterprise data, it architecture, big data
    
Red Hat
Published By: Dell     Published Date: Aug 16, 2013
Virtualization has changed the data center dynamic from static to fluid. While workloads used to be confined to the hardware on which they were installed, workloads today flow from host to host based on administrator-defined rules, as well as in reaction to changes in the host environment. The fluidic nature of the new data center has brought challenges to resource allocation; find out how your organization can stay ahead of the curve. Read the White Paper
Tags : 
best practices, cpu, memory, storage, vsphere, virtual
    
Dell
Published By: IBM     Published Date: Nov 07, 2016
Is your software-defined infrastructure (SDI) for high performance computing (HPC) and big data analytics meeting the needs of your growing business? Would you like to know how to justify the switching cost from unsupported open source software to a commercial-grade SDI that ensures your resources are used more effectively, cutting down time to market? This webcast will give you an overview of the true costs of building out and managing an HPC or big data environment and how commercial-grade SDI software from IBM can provide a significant return on investment.
Tags : 
ibm, platform computing, software defined infrastructure, business technology
    
IBM
Published By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : 
ibm, data, big data, information, healthcare, governance, technology, it management, data management
    
IBM
Published By: Oracle     Published Date: Apr 05, 2018
Effective collaboration between marketing and IT continues to be crucially important. As we enter an era where our business agility will be defined by the flexibility of operation of our marketing tools, technology in marketing is a key focus. Read ‘The guide to building your marketing technology stack’ to learn more about how technology can enhance every area of marketing:
• Rediscover the technology in marketing landscape
• Explore the latest in marketing automation
• See how content and social fit into your overall strategy
Oracle is a global technology company that provides a range of services, including Oracle Cloud, the ‘industry’s broadest and most integrated public cloud’, which lowers costs and reduces IT complexity. It also offers IT security services, database and Java services, enterprise management, consulting and support services, and more. Oracle has more than 420,000 customers worldwide and is widely revered throughout the computing industry.
Tags : 
    
Oracle
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
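The store-as-is, schema-on-read idea above can be sketched in a few lines. Here a dictionary stands in for the central store, and the object keys and records are invented for illustration: raw JSON and CSV objects land unchanged, and each consumer applies its own parsing only at read time.

```python
import csv
import io
import json

# A toy "lake": raw objects stored exactly as they arrived, with no upfront schema.
lake = {
    "clicks/2018-10-01.json": '{"user": "a1", "page": "/home"}',
    "sales/2018-10-01.csv": "order_id,amount\n42,19.99\n",
}

# Schema-on-read: structure is imposed by the consumer, not at ingest time.
def read_json(key):
    return json.loads(lake[key])

def read_csv(key):
    return list(csv.DictReader(io.StringIO(lake[key])))

print(read_json("clicks/2018-10-01.json")["page"])    # /home
print(read_csv("sales/2018-10-01.csv")[0]["amount"])  # 19.99
```

The design point is that a new question (say, joining clicks to sales) needs only a new reader, not a re-ingest - which is why the abstract says you no longer need to know your questions beforehand.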
Tags : 
data, lake, amazon, web, services, aws
    
AWS
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data into a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data - structured and unstructured - can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
    
AWS
Published By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of their data, structured and unstructured, in one, centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
Tags : 
    
Amazon Web Services