app volumes

Results 26 - 50 of 51
Published By: IBM     Published Date: Jul 08, 2016
In our 21-criteria evaluation of the dynamic case management (DCM) market, we identified the 14 most significant software vendors — Appian, bpm’online, Column Technologies, DST Systems, Eccentex, IBM, Isis Papyrus, Lexmark Enterprise Software, MicroPact, Newgen Software, OnBase by Hyland, OpenText, Pegasystems, and TIBCO Software — and researched, analyzed, and scored them. The evaluation focused on providers’ adaptive, analytics, and mobile features, all critical to helping enterprises tackle increasing volumes of varied and unstructured work. This report helps enterprise architecture (EA) professionals select the best providers to meet their unique needs.
Tags : 
ibm, forrester, forrester wave, dynamic case management, dcm
    
IBM
Published By: Group M_IBM Q2'19     Published Date: Apr 03, 2019
Data is the lifeblood of business. And in the era of digital business, the organizations that use data most effectively are also the most successful. Whether structured, unstructured or semi-structured, rapidly increasing data quantities must be brought into organizations, stored and put to work to enable business strategies. Data integration tools play a critical role in extracting data from a variety of sources and making it available for enterprise applications, business intelligence (BI), machine learning (ML) and other purposes. Many organizations seek to enhance the value of data for line-of-business managers by enabling self-service access. This is increasingly important as large volumes of unstructured data from Internet of Things (IoT) devices present organizations with opportunities for game-changing insights from big data analytics. A new survey of 369 IT professionals, from managers to directors and VPs of IT, by BizTechInsights on behalf of IBM reveals the challenges they face.
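To make the extract-and-integrate role of such tools concrete, here is a minimal sketch of the extract, normalize, load pattern in Python. It is a toy illustration only: the file names, field names and SQLite target are all assumptions, and a real integration tool adds scheduling, error handling, lineage tracking and far more connectors.

```python
import csv
import json
import sqlite3

# Toy stand-ins for real sources; file names and fields are invented.
def extract_crm(path):
    """Pull customer rows from a CSV export and normalize the schema."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {"id": row["customer_id"], "source": "crm", "name": row["name"]}

def extract_web(path):
    """Pull user records from a JSON dump and normalize to the same schema."""
    with open(path) as f:
        for rec in json.load(f):
            yield {"id": str(rec["uid"]), "source": "web", "name": rec["display_name"]}

def load(records, db_path="integrated.db"):
    """Land the unified records where BI and ML jobs can query them."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, source TEXT, name TEXT)")
    con.executemany("INSERT INTO customers VALUES (:id, :source, :name)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    rows = list(extract_crm("crm_export.csv")) + list(extract_web("web_users.json"))
    load(rows)
```

Self-service access then amounts to exposing that unified table, or its governed equivalent, to line-of-business tools.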
Tags : 
    
Group M_IBM Q2'19
Published By: NetApp     Published Date: Sep 22, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response-time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a 66x reduction in latency after incorporating NetApp Flash Pool technology.
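The caching principle behind a hybrid aggregate (hot blocks served from the SSD tier, cold reads falling through to the HDDs) can be sketched in a few lines of Python. This is a conceptual model only, using a plain LRU policy; NetApp's actual Flash Pool placement and eviction logic is proprietary, and every name below is illustrative.

```python
from collections import OrderedDict

class HybridReadCache:
    """Conceptual model of an SSD read cache in front of HDD storage.
    Illustrates hot/cold tiering in general, not NetApp's algorithm."""

    def __init__(self, backing_store, ssd_capacity_blocks):
        self.hdd = backing_store            # slow tier: block_id -> data
        self.ssd = OrderedDict()            # fast tier, kept in LRU order
        self.capacity = ssd_capacity_blocks
        self.hits = self.misses = 0

    def read_block(self, block_id):
        if block_id in self.ssd:            # hit: served at SSD latency
            self.ssd.move_to_end(block_id)
            self.hits += 1
            return self.ssd[block_id]
        self.misses += 1                    # miss: fall through to HDD
        data = self.hdd[block_id]
        self.ssd[block_id] = data           # promote the hot block
        if len(self.ssd) > self.capacity:   # evict the coldest block
            self.ssd.popitem(last=False)
        return data

# A working set that fits in the fast tier drives the hit ratio toward 1,
# which is the effect behind the IOPS and latency gains measured above.
cache = HybridReadCache({i: f"block-{i}".encode() for i in range(1000)}, 64)
for _ in range(10):
    for i in range(32):
        cache.read_block(i)
print(cache.hits, cache.misses)   # 288 hits, 32 misses
```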
Tags : 
flash pool, fas storage systems, ssd, online transaction processing, cluster storage
    
NetApp
Published By: Akamai Technologies     Published Date: Apr 25, 2018
Cyber attackers are targeting the application programming interfaces (APIs) used by businesses to share data with customers. Consumer mobile adoption, electronic goods and services, and high volumes of data have led businesses to use APIs for data exchange. Unfortunately, attackers can also use APIs to access or deny service to valuable data and systems. This white paper explores strategies for protecting APIs. You’ll learn about APIs, how and why these endpoints are targets for web application attacks, security models, and how Akamai can help.
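One common building block in API security models is message authentication: the client signs each request with a shared secret so the server can reject forged or replayed calls. Below is a generic sketch using only the Python standard library. It is not Akamai's product or API; the message layout, secret handling and skew window are assumptions.

```python
import hashlib
import hmac
import time

SHARED_SECRET = b"per-client secret provisioned out of band"   # hypothetical
MAX_SKEW_SECONDS = 300   # assumed replay window, a policy choice

def sign_request(method, path, body, timestamp):
    """Signature a trusted client attaches to an API call."""
    message = f"{method}\n{path}\n{timestamp}\n".encode() + body
    return hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()

def verify_request(method, path, body, timestamp, signature):
    """Server-side gate: reject stale timestamps and forged signatures."""
    if abs(time.time() - timestamp) > MAX_SKEW_SECONDS:
        return False   # stale or replayed request
    expected = sign_request(method, path, body, timestamp)
    # Constant-time comparison avoids leaking the signature via timing.
    return hmac.compare_digest(expected, signature)

ts = int(time.time())
sig = sign_request("POST", "/v1/orders", b'{"sku": 42}', ts)
assert verify_request("POST", "/v1/orders", b'{"sku": 42}', ts, sig)
assert not verify_request("POST", "/v1/orders", b'{"sku": 99}', ts, sig)
```

Signing is one layer; rate limiting, authorization and schema validation round out a typical API defense.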
Tags : 
api, security, interface, businesses, data, mobile, adoption
    
Akamai Technologies
Published By: IBM     Published Date: Nov 03, 2017
Massive shifts within the digital business landscape are sparking immense opportunities and reshaping every sector. In some cases, complete upheaval is happening at lightning-fast speed. In other instances, digital undercurrents are stirring beneath the surface as organizations scramble to monetize vast volumes and variety of data in an effort to sharpen their competitive edge and not be blindsided by unforeseen events that completely upend existing business models. While long-standing industry leadership might be no match for the next cool app, agility, speed and the ability to harness more data than was ever imagined is fueling powerful possibilities for reinvention among companies of every size. Data is flowing rapidly from mobile devices and social networks, as well as from every connected product, machine and infrastructure. This data holds the potential for deep insights that can replace guesswork and approximations as to locations, behaviors, patterns and preferences.
Tags : 
digital business, data, data-driven enterprise, innovation, ibm
    
IBM
Published By: IBM     Published Date: Jul 05, 2018
Data is the lifeblood of business. And in the era of digital business, the organizations that use data most effectively are also the most successful. Whether structured, unstructured or semi-structured, rapidly increasing data quantities must be brought into organizations, stored and put to work to enable business strategies. Data integration tools play a critical role in extracting data from a variety of sources and making it available for enterprise applications, business intelligence (BI), machine learning (ML) and other purposes. Many organizations seek to enhance the value of data for line-of-business managers by enabling self-service access. This is increasingly important as large volumes of unstructured data from Internet of Things (IoT) devices present organizations with opportunities for game-changing insights from big data analytics. A new survey of 369 IT professionals, from managers to directors and VPs of IT, by BizTechInsights on behalf of IBM reveals the challenges they face.
Tags : 
    
IBM
Published By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been a cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures. But ever-growing data volume and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce costs of storing, processing and analyzing large volumes of data. Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse. But to get the return on investment, you must infuse data governance processes as part of offloading.
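As an illustration of the offload pattern, here is a hedged PySpark sketch that pulls a cold slice of a warehouse table into Hadoop as Parquet, with a minimal data-quality gate standing in for the governance process. The JDBC URL, credentials, table, columns and quality rule are all assumptions; a production pipeline would also record lineage in a governance catalog.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("edw-offload").getOrCreate()

# Extract a cold, historical slice from the warehouse over JDBC.
orders = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://edw-host:5432/warehouse")
          .option("dbtable", "sales.orders")
          .option("user", "etl_user")
          .option("password", "***")
          .load()
          .filter(F.col("order_date") < "2016-01-01"))

# Minimal governance gate: refuse to offload rows that fail quality
# checks, so the Hadoop copy stays clean and of verifiable quality.
bad = orders.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad:
    raise ValueError(f"{bad} rows failed quality checks; fix before offloading")

# Land the slice in HDFS as partitioned Parquet for low-cost storage.
(orders.withColumn("order_year", F.year("order_date"))
       .write.mode("overwrite")
       .partitionBy("order_year")
       .parquet("hdfs:///archive/sales/orders"))
```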
Tags : 
    
Group M_IBM Q418
Published By: Oracle     Published Date: Oct 20, 2017
Modern technology initiatives are driving IT infrastructure in a new direction. Big data, social business, mobile applications, the cloud, and real-time analytics all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Customers increasingly drive the speed of business, and organizations need to engage with customers on their terms. The need to manage sensitive information with high levels of security as well as capture, analyze, and act upon massive volumes of data every hour of every day has become critical. These challenges will dramatically change the way that IT systems are designed, funded, and run compared to the past few decades. Databases and Java have become the de facto foundation on which modern, cloud-ready applications are built. The massive explosion in the volume, variety, and velocity of data increases the need for secure and effective analytics so that organizations can make better decisions.
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
In today’s IT infrastructure, data security can no longer be treated as an afterthought, because billions of dollars are lost each year to computer intrusions and data exposures. This issue is compounded by the aggressive build-out for cloud computing. Big data and machine learning applications that perform tasks such as fraud and intrusion detection, trend detection, and click-stream and social media analysis all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Companies increasingly need to drive up the speed of business, and organizations need to support their customers with real-time data. The task of managing sensitive information while capturing, analyzing, and acting upon massive volumes of data every hour of every day has become critical. These challenges have dramatically changed the way that IT systems are architected, provisioned, and run compared to the past few decades.
Tags : 
    
Oracle
Published By: VMware     Published Date: Sep 06, 2017
"By implementing the VMware App Volumes™ application management solution, OGL built a multitenant VDI environment that delivers an extensive range of applications to customers in real time, with full application functionality on any device, while also reducing OpEx and CapEx. Download this case study to learn how OGL Computer addressed these issues and more with VMware App Volumes: • Scale VDI service for more customers and applications. • Reduce expenses of Citrix VDI solution delivery. • Deliver and manage applications dynamically, in real time. • Give users desktop access on any Internet-connected device. "
Tags : 
    
VMware
Published By: VMware     Published Date: Dec 05, 2017
"Are you planning a migration off Citrix XenApp or XenDesktop? Or looking for ways to make them run as good as you thought they should? Then this is the session for you. Come hear from the architect of VMware's Citrix migration and enhancement strategies as he explains how key VMware technologies such as Horizon, App Volumes, Instant Clone and NSX can help. We will dive deep on how to evaluate the state of the Citrix environment, assess project requirements and review real world scenarios. By the end of the session you will know all about the best-practices, tools and services available to ensure your project is a complete success. Download the webinar to learn how VMware technologies can help solve your Citrix problems!"
Tags : 
vmware, citrix, app volumes
    
VMware
Published By: Cohesity     Published Date: Aug 09, 2019
The growing importance—and complexity—of data protection means old approaches will no longer get the job done in an era of exploding data volumes and ever-changing business requirements. It’s time to reimagine and reengineer your IT infrastructure for a more efficient, affordable and manageable data protection framework.
Tags : 
    
Cohesity
Published By: Marketo     Published Date: Jun 04, 2018
Until now, most organizations have built their MarTech stacks piece by piece—adding technologies to meet new needs or test new delivery methods. But with so many applications available and so many competitive gains to be made by using the right combination of technologies, it is increasingly important to have a strategy to connect your stack across marketing and other revenue-driving functions. By harnessing a well-considered MarTech stack, marketers can bring order to the overwhelming volumes of data they collect from online and offline interactions with prospective and existing customers. Better yet, this technology enables you to make those insights actionable, meaning you can make informed decisions. Download this ebook for seven practical steps your organization can take to put together a high-performance MarTech stack.
Tags : 
    
Marketo
Published By: NetApp     Published Date: Aug 19, 2010
You'll learn about the best storage configurations for Hyper-V, NetApp's virtualization "sweet spot," and Avanade's firsthand experience with two much-anticipated R2 features: Cluster Shared Volumes and Live Migration.
Tags : 
netapp, hyper-v r2, virtualization, cluster shared, live migration, windows server intergration
    
NetApp
Published By: IBM     Published Date: May 28, 2014
Different types of data have different data retention requirements. In establishing information governance and database archiving policies, take a holistic approach by understanding where the data exists, classifying the data, and archiving the data. IBM InfoSphere Optim™ Archive solution can help enterprises manage and support data retention policies by archiving historical data and storing that data in its original business context, all while controlling growing data volumes and improving application performance. This approach helps support long-term data retention by archiving data in a way that allows it to be accessed independently of the original application.
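The policy-driven approach described here (classify data, archive it with its business context, retain it per class) can be illustrated with a small generic sketch. This is not the InfoSphere Optim API; the policy tables, field names and JSON archive format are hypothetical stand-ins for what the product manages at enterprise scale.

```python
import datetime as dt
import json
import pathlib

# Hypothetical per-class rules; real policies come from governance bodies.
ACTIVE_WINDOW = {"transaction": dt.timedelta(days=365),
                 "session_log": dt.timedelta(days=30)}
RETENTION = {"transaction": dt.timedelta(days=7 * 365),   # keep 7 years
             "session_log": dt.timedelta(days=90)}

ARCHIVE = pathlib.Path("archive")

def sweep(records):
    """Archive records that have aged out of the active window, storing
    each with enough context to stay readable without the source app."""
    ARCHIVE.mkdir(exist_ok=True)
    now = dt.datetime.utcnow()
    for rec in records:
        created = dt.datetime.fromisoformat(rec["created"])
        if now - created > ACTIVE_WINDOW[rec["class"]]:
            out = ARCHIVE / f"{rec['class']}-{rec['id']}.json"
            out.write_text(json.dumps({"archived_at": now.isoformat(),
                                       "record": rec}))

def retention_expired(path):
    """True once an archived record may be disposed of under its policy."""
    doc = json.loads(path.read_text())
    archived = dt.datetime.fromisoformat(doc["archived_at"])
    return dt.datetime.utcnow() - archived > RETENTION[doc["record"]["class"]]
```

Because each archive file carries its classification and payload together, it can be read independently of the original application, which is the property the paragraph above highlights.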
Tags : 
ibm, data retention, information governance, archiving, historical data, integrating big data, governing big data, integration, best practices, big data, ibm infosphere, it agility, performance requirements, hadoop, scalability, data integration, big data projects, high-quality data, leverage data replication, data persistence
    
IBM
Published By: IBM     Published Date: Jul 19, 2016
Data movement and management is a major pain point for organizations operating HPC environments. Whether you are deploying a single cluster or managing a diverse research facility, you should be taking a data-centric approach. As data volumes grow and the cost of compute drops, managing data consumes more of the HPC budget and computational time. The need for data-centric HPC architectures grows dramatically as research teams pool their budgets to purchase shared systems and improve overall utilization. Learn more in this white paper about the key considerations when expanding from traditional compute-centric to data-centric HPC.
Tags : 
ibm, analytics, hpc, big data
    
IBM
Published By: SAS     Published Date: Jan 04, 2019
As the pace of business continues to accelerate, forward-looking organizations are beginning to realize that it is not enough to analyze their data; they must also take action on it. To do this, more businesses are beginning to systematically operationalize their analytics as part of a business process. Operationalizing and embedding analytics is about integrating actionable insights into systems and business processes used to make decisions. These systems might be automated or provide manual, actionable insights. Analytics are currently being embedded into dashboards, applications, devices, systems, and databases. Examples run from simple to complex and organizations are at different stages of operational deployment. Newer examples of operational analytics include support for logistics, customer call centers, fraud detection, and recommendation engines to name just a few. Embedding analytics is certainly not new but has been gaining more attention recently as data volumes and the frequency of business decisions grow.
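As a minimal sketch of what embedding analytics into a business process can look like in code, the fraud-scoring example below computes a model score inside the transaction path and acts on it immediately. It assumes a scikit-learn-style classifier serialized offline; the artifact name, feature fields and threshold are all hypothetical, and this is not a SAS API.

```python
import pickle

# A classifier trained and serialized offline (hypothetical artifact name);
# a scikit-learn-style predict_proba interface is assumed.
with open("fraud_model.pkl", "rb") as f:
    model = pickle.load(f)

FRAUD_THRESHOLD = 0.9   # assumed business policy, tuned with the fraud team

def process_payment(payment):
    """Operationalized analytics: the score is computed inside the business
    process and acted on immediately, not read later in a report."""
    features = [[payment["amount"], payment["merchant_risk"],
                 payment["velocity_1h"]]]
    fraud_probability = model.predict_proba(features)[0][1]
    if fraud_probability >= FRAUD_THRESHOLD:
        return "hold_for_review"   # automated action on the insight
    return "approve"
```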
Tags : 
    
SAS
Published By: Esker     Published Date: Dec 08, 2014
Approximately 79% of customers seeking customer service and support wish it were easier and more convenient to obtain.* Read the new Customer Service Workbook to discover industry tips and best practices for bringing efficiency and profitability to order management while freeing up your CSRs to:
• Process orders quicker with fewer errors
• Carry out customer engagement strategies
• Manage order volumes at current staffing levels
• Increase customer loyalty
Tags : 
customer service, csr, outbound calls, self-service, strategies, automation, electronic workflow, visibility, streamline, social media
    
Esker
Published By: Neterion     Published Date: Dec 05, 2006
The relentless growth of data and network-intensive applications such as digital imaging, multimedia content, and broadcast/video continues to drive volumes of enterprise data and network traffic. As growth continues, IT managers are challenged with implementing solutions without interrupting critical business processes.
Tags : 
network infrastructure, traffic management, bandwidth management, bandwidth, network management, neterion
    
Neterion
Published By: SAS     Published Date: Apr 20, 2017
Hype and hope — Big Data has generated a lot of both. Thanks to an abundance of enterprise information systems, networks, applications and devices that churn out huge volumes of information, government agencies are awash in Big Data. Add to this data growth the emerging trend of the Internet of Things (IoT) — the network of people, data, things and processes that is increasingly linked through automated connections and sensors — and the future of Big Data can seem quite daunting.
Tags : 
    
SAS
Published By: NetApp     Published Date: Dec 08, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate.
Tags : 
netapp, hybrid, flash pool, ssd, hdd, iops, oltp, demartek
    
NetApp
Published By: Cyveillance     Published Date: Jun 20, 2014
Phishing is defined by the Financial Services Technology Consortium (FSTC) as a broadly launched social engineering attack in which an electronic identity is misrepresented in an attempt to trick individuals into revealing personal credentials that can be used fraudulently against them. In short, it’s online fraud to the highest degree. Although it’s been around for years, phishing is still one of the most common and effective online scams. The schemes are varied, typically involving some combination of spoofed email (spam), malicious software (malware), and fake websites to harvest personal information from unwitting consumers. The explosive rise of mobile devices, mobile applications, and social media networks has given phishers new vectors to exploit, along with access to volumes of personal data that can be used in more targeted attacks or spear phishing. The fact that phishing attacks are still so common highlights their efficacy and reinforces the need to implement comprehensive phishing and response plans to protect organizations. An effective phishing protection plan should focus on four primary areas: Prevention, Detection, Response, and Recovery. High-level recommendations for each of the four areas are outlined in this whitepaper.
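On the Detection side, one simple and widely used building block is screening URLs for lookalike domains that impersonate protected brands. The sketch below uses an edit-distance heuristic from the Python standard library; real detection pipelines combine many stronger signals (reputation feeds, content analysis, sender authentication), and the brand watch list here is hypothetical.

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

PROTECTED_BRANDS = ["paypal.com", "bankofamerica.com"]   # hypothetical list

def lookalike_score(domain, brand):
    """Similarity in [0, 1]; near 1 but not exact suggests typosquatting."""
    return SequenceMatcher(None, domain, brand).ratio()

def is_suspicious(url, threshold=0.8):
    domain = (urlparse(url).hostname or "").lower()
    return any(domain != brand and lookalike_score(domain, brand) >= threshold
               for brand in PROTECTED_BRANDS)

assert is_suspicious("http://paypa1.com/login")        # digit-for-letter swap
assert not is_suspicious("https://paypal.com/help")    # the genuine domain
```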
Tags : 
cyveillance, phishing, security, cyberattacks, cybercriminals, threats
    
Cyveillance
Published By: Hewlett Packard Enterprise     Published Date: Jul 18, 2018
Enterprises are moving to the cloud because it offers greater flexibility and lower cost. The first wave of cloud-native and content apps is well on its way, but there's a new wave of traditional enterprise applications that enterprises are still struggling to move to the cloud. The Q&A walks you through the most common issues related to cloud storage—and how HPE Cloud Volumes puts any remaining concerns to rest.
Tags : 
    
Hewlett Packard Enterprise
Published By: CrowdStrike     Published Date: Feb 01, 2017
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack; a small sketch of the general graph idea follows the list below. Download the white paper to learn:
• How to detect known and unknown threats by applying high-volume graph-based technology, similar to those developed by Facebook and Google
• How CrowdStrike solved this challenge by building its own proprietary graph data model
• How CrowdStrike Threat Graph™ collects and analyzes massive volumes of security-related data to stop breaches
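To make the graph idea concrete: security events become nodes, shared attributes (host, user, file hash) become edges, and a connected component groups discrete events into one candidate incident. The toy Python sketch below illustrates only the general technique, not CrowdStrike's proprietary Threat Graph; the events and attribute keys are invented.

```python
from collections import defaultdict

# Toy events; in practice billions of telemetry records arrive daily.
events = [
    {"id": "e1", "host": "srv-01", "user": "alice", "hash": "abc123"},
    {"id": "e2", "host": "srv-01", "user": "bob",   "hash": "def456"},
    {"id": "e3", "host": "srv-02", "user": "bob",   "hash": "abc123"},
    {"id": "e4", "host": "srv-09", "user": "carol", "hash": "zzz999"},
]

# Index events by (attribute, value), then connect events sharing a value.
index = defaultdict(list)
for ev in events:
    for key in ("host", "user", "hash"):
        index[(key, ev[key])].append(ev["id"])

adjacency = defaultdict(set)
for ids in index.values():
    for event_id in ids:
        adjacency[event_id] |= set(ids) - {event_id}

def incident(start):
    """All events reachable from `start` via shared attributes."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(adjacency[node] - seen)
    return seen

print(sorted(incident("e1")))   # ['e1', 'e2', 'e3']; e4 stays separate
```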
Tags : 
    
CrowdStrike