HIT Consultant: Insightful coverage of healthcare innovation
 

IT Discovery

Results 151 - 175 of 211
Published By: Proofpoint     Published Date: Apr 22, 2010
This whitepaper addresses the key e-discovery challenges facing legal and IT departments today, from the impact of regulations such as the Federal Rules of Civil Procedure (FRCP) to the technology required to address them. It then describes how businesses can develop a proactive strategy for handling e-discovery requests in the event of litigation, saving time and money in the process.
Tags : 
proofpoint, email archiving, e-discovery, federal rules of civil procedure, frcp, electronic data, search and retrieval of messages
    
Proofpoint
Published By: Proofpoint     Published Date: Apr 06, 2012
Gartner, Inc. positions Proofpoint in the Challengers quadrant in its 2011 Magic Quadrant for the Enterprise Information Archiving (email archiving, file archiving, social media archiving, retention management, eDiscovery, search, legal hold) market.
Tags : 
gartner, email archiving, archive email, eia magic quadrant, search, ediscovery, legal hold
    
Proofpoint
Published By: ProofSpace     Published Date: Jul 31, 2007
This paper details the processes by which ProofMark tags electronic records with a self-validating cryptographic seal that acts as a "tamper indicator" based on a true and provable time-reference datum. With this, it can provide instantaneous and irrefutable proof of authenticity, no matter where the data resides or who has controlled it.
Tags : 
proofspace, proofmark, signing key vulnerability, trustworthy time, authenticated time stamps, time existence of data
    
ProofSpace
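The ProofMark description above centers on binding a record's fingerprint to a trustworthy point in time so any later change is detectable. The Python sketch below is a rough, hypothetical illustration of that general idea only, not ProofSpace's actual scheme: it hashes a record, ties the digest to a timestamp, and seals the pair; the HMAC key stands in for a real timestamping authority's signing capability (production systems use PKI signatures and trusted time sources).

```python
# Hypothetical tamper-indicating time seal, in the spirit of the entry above.
# NOT ProofSpace's implementation; the HMAC key stands in for a sealing authority.
import hashlib
import hmac
import json
import time

SEAL_KEY = b"authority-secret"  # hypothetical sealing authority key


def seal_record(record_bytes: bytes) -> dict:
    """Hash the record and bind the digest to a time reference."""
    digest = hashlib.sha256(record_bytes).hexdigest()
    sealed_at = int(time.time())  # stand-in for a provable time datum
    payload = f"{digest}|{sealed_at}".encode()
    tag = hmac.new(SEAL_KEY, payload, hashlib.sha256).hexdigest()
    return {"digest": digest, "sealed_at": sealed_at, "tag": tag}


def verify_seal(record_bytes: bytes, seal: dict) -> bool:
    """Recompute the hash and check the seal; any change breaks verification."""
    digest = hashlib.sha256(record_bytes).hexdigest()
    payload = f"{digest}|{seal['sealed_at']}".encode()
    expected = hmac.new(SEAL_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, seal["tag"])


record = json.dumps({"id": 42, "amount": 100.0}).encode()
seal = seal_record(record)
print(verify_seal(record, seal))                 # True
print(verify_seal(record + b" tampered", seal))  # False
```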
Published By: ProofSpace     Published Date: Aug 07, 2007
This white paper presents a solution framework for Life Science Organizations that want to implement enterprise digital trust management to protect their electronic value chain.
Tags : 
proofspace, proofmark, signing key vulnerability, trustworthy time, authenticated time stamps, time existence of data, electronic postmark, time stamp
    
ProofSpace
Published By: ProofSpace     Published Date: Aug 08, 2007
Digital signatures aren't enough. This paper posits that data integrity needs to be redefined within the context of a time-sensitive mechanism.
Tags : 
proofspace, proofmark, signing key vulnerability, trustworthy time, authenticated time stamps, time existence of data, electronic postmark, time stamp
    
ProofSpace
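To make the "time-sensitive" point above concrete: a signature only demonstrates integrity if it was produced while the signing key was valid and before any known compromise of that key, so verification has to consider when the signature was made, not just whether it checks out mathematically. The Python sketch below is a minimal, hypothetical illustration of that check; the key record, dates and fields are invented, and a real system would rely on PKI certificates, revocation data and trusted timestamps.

```python
# Hypothetical check of signature trustworthiness at the time of signing.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class SigningKeyRecord:
    valid_from: datetime
    valid_to: datetime
    compromised_at: Optional[datetime] = None  # None if never compromised


def signature_trustworthy(signed_at: datetime, key: SigningKeyRecord) -> bool:
    """True only if the signature was made while the key was valid
    and before any known compromise of that key."""
    if not (key.valid_from <= signed_at <= key.valid_to):
        return False
    if key.compromised_at is not None and signed_at >= key.compromised_at:
        return False
    return True


key = SigningKeyRecord(
    valid_from=datetime(2006, 1, 1, tzinfo=timezone.utc),
    valid_to=datetime(2008, 1, 1, tzinfo=timezone.utc),
    compromised_at=datetime(2007, 6, 1, tzinfo=timezone.utc),
)
print(signature_trustworthy(datetime(2007, 3, 1, tzinfo=timezone.utc), key))  # True
print(signature_trustworthy(datetime(2007, 9, 1, tzinfo=timezone.utc), key))  # False
```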
Published By: ProofSpace     Published Date: Aug 17, 2007
This white paper defines the life cycle of an electronically signed record and describes the requirements for handwritten signature-equivalence throughout its retention period.
Tags : 
proofspace, proofmark, signing key vulnerability, trustworthy time, authenticated time stamps, time existence of data, electronic postmark, asymmetric key cryptography
    
ProofSpace
Published By: ProofSpace     Published Date: Sep 10, 2007
Read this paper and learn the principles that are prerequisites to enforceable electronic agreements, as required by existing legal standards and electronic signature legislation. The paper also specifies sixteen measurement criteria that can be used to assess whether the architecture of an electronic transaction will meet the requirements of admissibility.
Tags : 
proofspace, proofmark, signing key vulnerability, trustworthy time, authenticated time stamps, time existence of data, electronic postmark, asymmetric key cryptography
    
ProofSpace
Published By: Qlik     Published Date: Aug 12, 2015
This Research Report examines the analytical strategies of organisations currently using data discovery tools and highlights their superior performance in user engagement.
Tags : 
data discovery, data, data discovery tools, analytics, benefits of using data discovery, data collaboration, data proliferation, data engagement
    
Qlik
Published By: Rackspace     Published Date: Apr 15, 2019
Scale events — like online sales and digital product launches — present great revenue opportunities, but they also present large risks to your business. Whether you are a retailer preparing for Black Friday and Cyber Monday, or a digital vendor launching a new service, your brand is both at its most visible and its most vulnerable during these scale events. Many more customers visit your site over a short period of time, raising the potential for resource constraints and discovery of software bugs. Information about issues spreads quickly via social media and news outlets. And, your customers typically spend more per transaction, so every lost order has a greater negative impact on your bottom line. Site reliability engineering (SRE) can help you better prepare for scale events through an iterative cycle of data-driven improvement.
Tags : 
    
Rackspace
Published By: Rand Secure Archive     Published Date: Jun 06, 2013
Without a doubt, technology has revolutionized the way business is done. The vast majority of internal and external communication is now done through email, instant messaging, VOIP and other types of electronic communication. In addition, most business documents exist in digital form. The digitization of business has had significant implications for litigation. Legal eDiscovery is now an essential part of settling cases between organizations. Previously, many companies believed that eDiscovery was only for organizations that were public or highly regulated, but now most companies recognize that electronically stored information (ESI) regulations are applicable to all organizations and there is an ever-increasing need for cost-effective eDiscovery.
Tags : 
coding, ediscovery, legal, effective, organization
    
Rand Secure Archive
Published By: Riverbed     Published Date: May 18, 2012
As your business needs become more dynamic, monitoring systems will be key to understanding how applications fulfill those needs, and your ability to provide governance will determine the success of your entire enterprise. With complexity growing at such a rapid rate, automated application discovery and dependency mapping will soon be not just useful but mandatory.
Tags : 
service management, management software, infrastructure, devops, cloud performance monitoring, application monitoring, database monitoring, end user experience monitoring
    
Riverbed
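As a loose illustration of what automated application discovery and dependency mapping produces, the Python sketch below builds a dependency graph from hypothetical observed connections and walks it to list everything an application depends on. The service names and flow records are invented; products like the one described above derive them from real network traffic.

```python
# Toy dependency map built from (client, server) connection observations.
from collections import defaultdict

# Hypothetical connection pairs observed on the network
observed_flows = [
    ("web-frontend", "order-service"),
    ("order-service", "postgres"),
    ("order-service", "payment-gateway"),
    ("web-frontend", "auth-service"),
    ("auth-service", "postgres"),
]

dependencies = defaultdict(set)
for client, server in observed_flows:
    dependencies[client].add(server)


def all_dependencies(app, graph, seen=None):
    """Walk the graph to find direct and indirect dependencies of an app."""
    seen = set() if seen is None else seen
    for dep in graph.get(app, ()):
        if dep not in seen:
            seen.add(dep)
            all_dependencies(dep, graph, seen)
    return seen


print(sorted(all_dependencies("web-frontend", dependencies)))
# ['auth-service', 'order-service', 'payment-gateway', 'postgres']
```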
Published By: Riverbed     Published Date: Oct 12, 2012
Riverbed® Cascade® is an application-aware network performance management solution with built-in application discovery and dependency mapping capabilities. The result is dramatically shortened project times, reduced costs, and improved efficiencies.
Tags : 
application discovery, it planning, data consolidation, virtualization
    
Riverbed
Published By: Riverbed     Published Date: Oct 17, 2012
As your business needs become more dynamic, monitoring systems will be key to understanding how applications fulfill business needs. With complexity growing at a fast rate, application discovery and mapping will soon be mandatory.
Tags : 
riverbed, application visibility, infrastructure monitoring, it infrastructure
    
Riverbed
Published By: Riverbed     Published Date: Jul 17, 2013
Effective IT planning and decision making depend on having a complete asset inventory and an accurate understanding of dependencies before, during and after IT change activities. Application discovery and dependency mapping (ADDM) solutions automate the process of discovering and mapping transactions and applications to the underlying infrastructure and application interdependencies. Learn how Riverbed Cascade’s built-in application discovery and dependency mapping capabilities can help you shorten project times and mitigate risk.
Tags : 
effective it, addm, application discovery, riverbed cascade
    
Riverbed
Published By: Riverbed     Published Date: Sep 05, 2014
Information is becoming the currency of the 21st century, and organizations of all sizes are creating more information than ever before. Frequently, organizations struggle with how best to balance storage requirements with available budgets. The benefits of greater storage capacity are clear: better change management and roll-back, easier archival and discovery, and reliable backup and recovery. But continually growing on-site capacity requires unsustainable investment and introduces unnecessary risk. Prominent cloud service providers present an appealing alternative with their vast amounts of storage, offering low cost, high availability, and built-in redundancy. Riverbed’s Whitewater® family of cloud storage gateways is a drop-in replacement for existing tape- and disk-based backup targets, cuts backup costs 30-50% compared to tape and disk, and provides effective DR capability with minimal effort.
Tags : 
data security, cloud storage, gateways, requirements, archival, discovery, management
    
Riverbed
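The gateway approach described above replaces tape and disk targets by deduplicating backup data and pushing it to low-cost object storage. The toy Python sketch below shows only the general chunk-and-deduplicate idea; it is not Riverbed's implementation, and the in-memory dictionary stands in for a cloud object store.

```python
# Toy deduplicating backup: split data into chunks, store each unique chunk once.
import hashlib

CHUNK_SIZE = 4 * 1024  # 4 KB chunks, arbitrary for the sketch
cloud_store = {}       # chunk hash -> chunk bytes (stand-in for an object store)


def backup(data: bytes) -> list:
    """Store any chunks not seen before; return the ordered chunk-hash manifest."""
    manifest = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).hexdigest()
        if key not in cloud_store:      # dedup: skip chunks already stored
            cloud_store[key] = chunk
        manifest.append(key)
    return manifest


def restore(manifest: list) -> bytes:
    """Reassemble the original data from the stored chunks."""
    return b"".join(cloud_store[key] for key in manifest)


original = b"A" * 40960 + b"B" * 4096   # highly repetitive sample data
manifest = backup(original)
print(len(manifest), "chunks referenced,", len(cloud_store), "unique chunks stored")
print(restore(manifest) == original)    # True
```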
Published By: Riverbed     Published Date: Feb 15, 2018
To stay ahead in today's hybrid network, you need a lens into the end user's experience as well as an understanding of the dependencies between your applications and network. With this approach, you are alerted to issues before the business is impacted, and problems are resolved faster. This eBook details what you need to know to select a best-of-breed network performance management solution and outlines the critical capabilities required for deep application visibility across virtualized, hybrid and cloud networks, no matter where a user is located. Read this book and:
- Discover best practices for proactive network monitoring and fast troubleshooting
- Learn how to stay ahead of application performance issues with increased visibility
- Increase productivity and ROI with automatic discovery, end-to-end monitoring, reporting, analytics and faster MTTR
- Ensure your approach is proactive
- Study the evaluation checklist to select the best solution for your needs
Tags : 
    
Riverbed
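One way to picture the "alerted before the business is impacted" claim above is a simple threshold check of recent response times against a baseline, as in the hypothetical Python sketch below. The baseline, window and sample values are invented; a real network performance management tool derives them from live traffic and far richer metrics.

```python
# Hypothetical proactive latency check: alert when the rolling average degrades.
from statistics import mean

BASELINE_MS = 120.0          # assumed normal response time
DEGRADATION_FACTOR = 1.5     # alert if 50% slower than baseline
WINDOW = 5                   # number of recent samples to average


def check(samples_ms):
    """Return an alert string if the rolling average breaches the threshold."""
    if len(samples_ms) < WINDOW:
        return None
    recent = mean(samples_ms[-WINDOW:])
    if recent > BASELINE_MS * DEGRADATION_FACTOR:
        return (f"ALERT: avg {recent:.0f} ms over last {WINDOW} samples "
                f"(baseline {BASELINE_MS:.0f} ms)")
    return None


samples = [118, 125, 130, 170, 210, 240, 260]   # latency creeping up
print(check(samples))
```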
Published By: Riverbed     Published Date: Jan 25, 2018
To stay ahead in today's hybrid network, you need a lens into the end user's experience as well as an understanding of the dependencies between your applications and network. With this approach, you are alerted to issues before the business is impacted, and problems are resolved faster. This eBook details what you need to know to select a best-of-breed network performance management solution and outlines the critical capabilities required for deep application visibility across virtualized, hybrid and cloud networks, no matter where a user is located. Read this book and:
- Discover best practices for proactive network monitoring and fast troubleshooting
- Learn how to stay ahead of application performance issues with increased visibility
- Increase productivity and ROI with automatic discovery, end-to-end monitoring, reporting, analytics and faster MTTR
- Ensure your approach is proactive
- Study the evaluation checklist to select the best solution for your needs
Tags : 
    
Riverbed
Published By: Rovi     Published Date: Apr 08, 2013
An Introduction to Rovi Insight Series Volume 1
Tags : 
rovi, over-the-top, ott, entertainment metadata, digital media, digital discovery, entertainment discovery, recommendations engine
    
Rovi
Published By: Rovi     Published Date: Apr 08, 2013
An introduction to Rovi Insights series Mobile World Congress 2013 Edition.
Tags : 
rovi, mobile, multi-screen, high efficiency video coding, hevc, entertainment metadata, digital media, digital discovery
    
Rovi
Published By: SAS     Published Date: Apr 25, 2017
Business intelligence has come a long way, from assistance with report generation to self-service platforms for discovery and analytical insight. As technological capabilities and business aptitude with information continue to advance, the next generation of BI will be even more capable and valuable to the enterprise. To discuss today’s success factors and tomorrow’s opportunities, IIA spoke with Andy Bitterer, Senior Director of Business Intelligence Product Management at SAS, and Tapan Patel, Principal Product Marketing Manager at SAS.
Tags : 
    
SAS
Published By: SAS     Published Date: Oct 18, 2017
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS
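To make the data lake pattern described above concrete, the short Python sketch below lands raw events untouched (quick ingestion) and applies structure only when a question is asked (schema-on-read). The paths, partitioning scheme and field names are illustrative assumptions, not taken from the TDWI report, and a temporary local directory stands in for HDFS or object storage.

```python
# Hypothetical data lake sketch: raw ingestion first, schema applied on read.
import json
from datetime import date
from pathlib import Path
from tempfile import mkdtemp

lake_root = Path(mkdtemp())  # stand-in for HDFS / object storage

# 1. Quick ingestion: append raw events untouched, partitioned by load date.
raw_events = [
    {"user": "a1", "action": "view", "sku": "123"},
    {"user": "b2", "action": "buy", "sku": "123", "price": 19.99},
]
partition = lake_root / "events" / f"load_date={date.today()}"
partition.mkdir(parents=True, exist_ok=True)
with open(partition / "part-0001.jsonl", "a") as f:
    for event in raw_events:
        f.write(json.dumps(event) + "\n")


# 2. Schema-on-read: impose only the fields this analysis needs, on the fly.
def read_purchases(root: Path):
    for path in root.glob("events/*/*.jsonl"):
        with open(path) as f:
            for line in f:
                record = json.loads(line)
                if record.get("action") == "buy":
                    yield {"user": record["user"], "price": record.get("price", 0.0)}


print(list(read_purchases(lake_root)))
```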
Published By: SAS     Published Date: Mar 06, 2018
Business intelligence has come a long way, from assistance with report generation to self-service platforms for discovery and analytical insight. As technological capabilities and business aptitude with information continue to advance, the next generation of BI will be even more capable and valuable to the enterprise. To discuss today’s success factors and tomorrow’s opportunities, IIA spoke with Rick Styll, Senior Manager, Visual Analytics Product Management at SAS, and Tapan Patel, Principal Product Marketing Manager at SAS.
Tags : 
    
SAS
Published By: SAS     Published Date: Mar 06, 2018
Known for its industry-leading analytics, data management and business intelligence solutions, SAS is focused on helping organizations use data and analytics to make better decisions, faster. The combination of self-service BI and analytics positions you for improved productivity and smarter business decisions. So you can become more competitive as you use all your data to take better actions. Instead of depending on hunch-based choices, you can make decisions that are truly rooted in discovery and analytics. And you can do it through an interface that anyone can use. At last, your business users can get close enough to the data to manipulate it and draw their own reliable, fact-based conclusions. And they can do it in seconds or minutes, not hours or days. Equally important, IT remains in control of data access and security by providing trusted data sets and defined processes that promote valuable, user-generated content for reuse and consistency. But they are no longer forced to field every request for reports and analysis themselves.
Tags : 
    
SAS
Published By: SAS     Published Date: Mar 06, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. With the right end-user tools, a data lake can enable the self-service data practices that both technical and business users need. These practices wring business value from big data, other new data sources, and burgeoning enterprise data.
Tags : 
    
SAS
Published By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS