transactional data

Results 1 - 25 of 45. Sort Results By: Published Date | Title | Company Name
Published By: StrongMail     Published Date: Jun 08, 2008
The growing trend towards insourcing marketing and transactional email is being driven by businesses that are looking for ways to improve their email programs, increase data security and lower costs. When evaluating whether it makes more sense to leverage an on-premise or outsourced solution, it's important to understand how the traditional arguments have changed.
Tags : 
strongmail, social media, email marketing, on-premise advantage, transactional email, insourcing, bounce management, spam trappers, emarketing, roi, networks, social networking
    
StrongMail
Published By: SAP     Published Date: May 18, 2014
This white paper discusses the issues involved in the traditional practice of deploying transactional and analytic applications on separate platforms using separate databases. It analyzes the results from a user survey, conducted on SAP's behalf by IDC, that explores these issues.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
    
SAP
Published By: SAP     Published Date: Nov 16, 2017
The SAP HANA platform has successfully become a proven mainstay data management solution, supporting the full range of analytic and transactional software from SAP, both in the data center and in the cloud.
Tags : 
    
SAP
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data. In-memory databases have helped address…
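The two-tier OLTP-plus-warehouse model the abstract describes can be sketched as a toy, assuming a batch ETL step; all names and the nightly schedule here are illustrative assumptions, not details from the paper:

```python
import datetime

# Toy illustration of the traditional OLTP -> ETL -> warehouse model.
# The row structures and the "nightly" ETL are illustrative assumptions.

oltp_rows = []          # row store serving live transactions
warehouse_rows = []     # separate analytic copy, refreshed by ETL

def record_order(order_id, amount):
    """OLTP write path: transactions land here immediately."""
    oltp_rows.append({"id": order_id, "amount": amount,
                      "ts": datetime.datetime.now()})

def nightly_etl():
    """Batch ETL: copy OLTP data into the warehouse."""
    warehouse_rows.clear()
    warehouse_rows.extend(oltp_rows)

def analytic_total():
    """Analysis runs against the warehouse copy, which is stale for
    any orders recorded since the last ETL run."""
    return sum(r["amount"] for r in warehouse_rows)

record_order(1, 100.0)
nightly_etl()
record_order(2, 50.0)        # arrives after the ETL ran
print(analytic_total())      # prints 100.0 -- the warehouse is stale
```

The second problem the abstract names (stale data) is exactly what the final `print` shows: the analytic answer misses the order that arrived after the last ETL run.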
Tags : 
    
Oracle CX
Published By: Dell PC Lifecycle     Published Date: Aug 13, 2018
Your business may need to keep track of dozens of different initiatives—but that doesn’t mean you need dozens of separate storage solutions to get the job done. To reduce complexity, your business may consider storage solutions that can take care of multiple jobs at once without sacrificing performance. For example, if you operate a brick-and-mortar store and an online store, you should be able to retrieve customer data from both sources without compromising transactional database performance. The all-flash Dell EMC™ SC5020 storage array aims to be just such a solution.
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Aug 13, 2018
The all-flash Dell EMC SC5020 storage array handled transactional database workloads and data mart imports better than an HPE solution without sacrificing performance. With the all-flash Dell EMC™ SC5020 storage array, your company could attend to more customer orders each minute and save time while simultaneously importing data.
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Aug 13, 2018
Your business may need to keep track of dozens of different initiatives—but that doesn’t mean you need dozens of separate storage solutions to get the job done. To reduce complexity, your business may consider storage solutions that can take care of multiple jobs at once without sacrificing performance. For example, if you operate a brick-and-mortar store and an online store, you should be able to retrieve customer data from both sources without compromising transactional database performance. The all-flash Dell EMC™ SC5020 storage array aims to be just such a solution.
Tags : 
    
Dell PC Lifecycle
Published By: Oracle     Published Date: Aug 02, 2018
Traditional backup systems fail to meet the needs of modern organizations by focusing on backup, not recovery. They treat databases as generic files to be copied, rather than as transactional workloads with specific data integrity, consistency, performance, and availability requirements. Additionally, highly regulated industries, such as financial services, are subject to ever-increasing regulatory mandates that require stringent protection against data breaches, data loss, malware, ransomware, and other risks. These risks require fiduciary-class data recovery to eliminate data loss exposure and ensure data integrity and compliance. This book explains modern database protection and recovery challenges (Chapter 1), the important aspects of a database protection and recovery solution (Chapter 2), Oracle’s database protection and recovery solutions (Chapter 3), and key reasons to choose Oracle for your database protection and recovery needs (Chapter 4).
Tags : 
    
Oracle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
In the end, the Dell EMC VMAX 250F with Intel® Xeon® Processor All Flash storage array lived up to its promises better than the HPE 3PAR 8450 Storage array did. We experienced minimal impact to database performance when the VMAX 250F processed transactional and data mart loading at the same time. This is useful whether you're performing extensive backups or compiling large amounts of data from multiple sources. Intel Inside®. New Possibilities Outside.
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Prevent unexpected downtime with reliable failover protection. We interrupted access to both local storage arrays; the Dell EMC database host seamlessly redirected all I/O to the remote VMAX 250F with Intel® Xeon® Processor via SRDF/Metro, with no interruption of service or downtime. The 3PAR solution crashed until the standby paths became active and we restarted the VM.
Tags : 
    
Dell PC Lifecycle
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, negatively impact performance in mission-critical production databases, and deliver recovery time objectives (RTO) and recovery point objectives (RPO) measured in hours or even days, failing to meet the requirements of high-volume, highly transactional databases -- potentially costing millions in lost productivity and revenue, regulatory penalties, and reputation damage due to an outage or data loss.
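As a back-of-the-envelope illustration of the RPO point above (the figures are hypothetical, not from the paper): with periodic backups, the worst-case data loss window is the full backup interval, and the cost scales with the database's transaction value per hour.

```python
# Hypothetical illustration of recovery point objective (RPO) exposure.
# All figures are made up for the example, not taken from the paper.

def worst_case_data_loss_hours(backup_interval_hours):
    """An outage just before the next backup loses everything
    written since the previous one."""
    return backup_interval_hours

def lost_revenue(backup_interval_hours, revenue_per_hour):
    """Rough worst-case exposure estimate for a transactional system."""
    return worst_case_data_loss_hours(backup_interval_hours) * revenue_per_hour

# Nightly (24-hour) backups of a database processing a
# hypothetical $50,000/hour in orders:
print(lost_revenue(24, 50_000))   # prints 1200000
```

This is why the abstract frames hours-long RPOs as "potentially costing millions": the exposure is multiplicative in backup interval and transaction volume.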
Tags : 
data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
    
Oracle ZDLRA
Published By: IBM     Published Date: Oct 13, 2016
Compare IBM DB2 pureScale with any other offering under consideration for implementing a clustered, scalable database configuration; see how each delivers continuous availability and why that matters. Download now!
Tags : 
data. queries, database operations, transactional databases, clustering, it management, storage, business technology
    
IBM
Published By: IBM     Published Date: May 17, 2016
Analyst Mike Ferguson of Intelligent Business Strategies writes about the enhanced role of transactional DBMS systems in today's world of Big Data. Learn more about how Big Data provides richer transactional data and how that data is captured and analyzed to meet tomorrow’s business needs. Access the report now.
Tags : 
ibm, intelligent business solutions, big data, transaction data, business analytics
    
IBM
Published By: IBM     Published Date: Jul 05, 2016
This white paper is written for SAP customers evaluating their infrastructure choices; it discusses the evolution of database technology and the options available.
Tags : 
ibm, always on business, analytics, database, database platform, sap, blu, db2, networking, knowledge management, storage, data management, business technology
    
IBM
Published By: Workday     Published Date: Jan 17, 2019
At Workday, we take a unique approach to development. We build one technology platform, on a single codeline, giving you a single security model and one source of truth. See how our singular methodology delivers real-time transactional data everyone can trust.
Tags : 
methodology, security model, technology
    
Workday
Published By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the trend as of late is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
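A minimal sketch of the row-store-plus-column-store idea the abstract describes, assuming a tiny in-process table; this is a toy model of the concept, not BLU Acceleration's internals:

```python
# Toy illustration of keeping a derived column store alongside a row
# store. The table and its contents are illustrative assumptions.

rows = [
    {"id": 1, "region": "EU", "amount": 100},
    {"id": 2, "region": "US", "amount": 250},
    {"id": 3, "region": "EU", "amount": 75},
]

# Derived column store: one contiguous list per column, suited to
# scanning and aggregating a single attribute.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# A transactional point lookup goes to the row store...
order = next(r for r in rows if r["id"] == 2)

# ...while an analytic aggregate scans only the one column it needs.
total = sum(columns["amount"])
print(order["amount"], total)   # prints 250 425
```

The design choice mirrors the abstract: point lookups touch whole rows, while aggregates touch one column, so keeping both layouts serves both workloads without one blocking the other.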
Tags : 
memory analytics, database, efficiency, acceleration technology, aggregate data
    
IBM
Published By: KPMG     Published Date: Jul 10, 2018
Working out what consumers want – and why – is getting harder. Transactional data and traditional market research and demographic profiles no longer do the job. Our ‘Five Mys’ report proposes a radical new framework for navigating complex consumer decision-making. Read the report to find out:
• what the ‘Five Mys’ are and how they affect spending decisions
• how to get better at predicting consumers’ changing needs
• where different generations are directing their spending
• how changing life patterns are creating new opportunities for businesses that can pick up on signals from consumers
Tags : 
    
KPMG
Published By: Datastax     Published Date: May 14, 2018
For any business that wants to successfully compete in today’s digital economy, it is not a question of if but rather how much of their business will be done with cloud applications. A cloud application is one with many endpoints including browsers, mobile devices, and/or machines that are geographically distributed. The application is intensely transactional (high velocity reads and/or writes), always available, and instantaneously responsive no matter the number of users or machines using the application. Download this free white paper and explore how DataStax customers are delivering real-time value at epic scale with their cloud applications. Explore the core database requirements that make businesses successful with cloud applications, which include continuous availability, linear scale, and geographic distribution.
Tags : 
    
Datastax
Published By: Datastax     Published Date: Aug 27, 2018
For any business that wants to successfully compete in today’s digital economy, it is not a question of if but rather how much of their business will be done with cloud applications. A cloud application is one with many endpoints including browsers, mobile devices, and/or machines that are geographically distributed. The application is intensely transactional (high velocity reads and/or writes), always available, and instantaneously responsive no matter the number of users or machines using the application. Download this free white paper and explore how DataStax customers are delivering real-time value at epic scale with their cloud applications. Explore the core database requirements that make businesses successful with cloud applications, which include continuous availability, linear scale, and geographic distribution.
Tags : 
    
Datastax
Published By: Oracle     Published Date: May 03, 2017
Traditional backup systems fail to meet the needs of modern organisations by focusing on backup, not recovery. They treat databases as generic files to be copied, rather than as transactional workloads with specific data integrity, consistency, performance, and availability requirements.
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data. In-memory databases have helped address…
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Traditional backup systems fail to meet the needs of modern organizations by focusing on backup, not recovery. They treat databases as generic files to be copied, rather than as transactional workloads with specific data integrity, consistency, performance, and availability requirements. Additionally, highly regulated industries, such as financial services, are subject to ever-increasing regulatory mandates that require stringent protection against data breaches, data loss, malware, ransomware, and other risks. These risks require fiduciary-class data recovery to eliminate data loss exposure and ensure data integrity and compliance.
Tags : 
    
Oracle
Published By: Pentaho     Published Date: Mar 08, 2016
If you’re evaluating big data integration platforms, you know that with the increasing number of tools and technologies out there, it can be difficult to separate meaningful information from the hype, and identify the right technology to solve your unique big data problem. This analyst research provides a concise overview of big data integration technologies, and reviews key things to consider when creating an integrated big data environment that blends new technologies with existing BI systems to meet your business goals. Read the Buyer’s Guide to Big Data Integration by CITO Research to learn:
• What tools are most useful for working with Big Data, Hadoop, and existing transactional databases
• How to create an effective “data supply chain”
• How to succeed with complex data on-boarding using automation for more reliable data ingestion
• The best ways to connect, transport, and transform data for data exploration, analytics and compliance
Tags : 
data, buyer guide, integration, technology, platform, research
    
Pentaho
Published By: Epsilon     Published Date: Oct 19, 2012
Unlock the Mystery of Predictive Modeling -- Retailers that use customer transactional data to their advantage, coupled with other database elements, can learn how to build solid customer relationships and strengthen their ROI.
Tags : 
modeling, epsilon, marketing, marketing research, roi, transactional data
    
Epsilon
Published By: IBM     Published Date: Aug 31, 2012
As those hosting transactional data on System z today already realize, it is one of the most secure, highly available, and reliable platforms on the market. Now see how organizations are bringing their analytics to IBM System z to better harness their data and gain greater business insights while containing cost and reducing complexity.
Tags : 
    
IBM