What’s Minimal Downtime and Disaster Recovery Worth to Your Business

When determining a budget for your company’s Disaster Recovery and Business Continuity plans, you must consider the impacts of infrastructure downtime. Understanding the costs associated with productivity loss, customer satisfaction concerns, employee morale, potential revenue loss and the possibility of losing clients will allow you to set a budget that protects your business.

Determining the Cost of Downtime

The most accurate way to determine the cost of downtime is to calculate how many hours of downtime your company experienced in a set period of time. Once you have the total hours of downtime, consider the number of offices and the number of employees you have, along with the average hourly salary. This will allow you to calculate the hourly cost of disruptions to business operations.

It’s imperative to consider all three factors – the number of offices, the number of employees, and the average hourly salary – when determining the cost of downtime. An interruption to business continuity affects employees across your entire organization, as they will be unable to access vital business applications, contributing to a sizeable loss in productivity.
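
As a rough sketch, the calculation described above might look like the following (all figures are hypothetical and should be replaced with your own):

```python
def hourly_downtime_cost(num_offices, employees_per_office, avg_hourly_salary):
    """Estimate the productivity cost of one hour of downtime,
    assuming every employee is idled during an outage."""
    return num_offices * employees_per_office * avg_hourly_salary

# Hypothetical example: 3 offices, 50 employees each, $30/hour average salary
cost_per_hour = hourly_downtime_cost(3, 50, 30.0)

# With, say, 24 hours of downtime recorded over the past year:
annual_cost = cost_per_hour * 24

print(cost_per_hour)  # 4500.0
print(annual_cost)    # 108000.0
```

A refinement would be to weight each office by its actual headcount and salary mix rather than a single average, but even this simple estimate makes the scale of the risk concrete.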

Determine Your Company’s Tolerance for Data Loss

The next step is to analyze your business’ compliance requirements and to compare them with the downtime tolerances stated by law. If you operate in the healthcare, financial, legal or government industries, you will need to review the relevant regulatory requirements. HIPAA and PIPEDA have implemented specific guidelines regarding how company data must be stored and made available, and it is imperative to comply with these requirements to avoid the risk of legal complications.

Your company’s compliance policies on digital data protection will also help you understand your organization’s tolerance for data loss. These policies often state what data retention policies must be in place, and are key to protecting your company’s valuable information.

Additionally, all laws regarding file access controls, sharing controls and retention settings should be clearly defined in your compliance policies.

Determine Your Company’s Recovery Goals

Using the information collected from the two previous steps, determine recovery objectives that can be executed in the event of a disaster. These objectives will help you determine the necessary recovery procedures and timeframes to minimize downtime.

It’s crucial that your disaster recovery plan includes a Recovery Time Objective (RTO) and a Recovery Point Objective (RPO). The RTO is the maximum time allotted to recover systems after a disaster event occurs, whereas the RPO is the maximum permitted age of the data when recovering your systems – in other words, the maximum data loss, measured in time.
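
To illustrate how the two objectives differ, the sketch below checks a recovery event against a hypothetical RTO and RPO (every timestamp and threshold here is invented for the example):

```python
from datetime import datetime, timedelta

# Hypothetical objectives: systems restored within 4 hours,
# and no more than 1 hour of data lost.
RTO = timedelta(hours=4)
RPO = timedelta(hours=1)

disaster_time  = datetime(2024, 1, 1, 12, 0)   # when the outage struck
last_backup    = datetime(2024, 1, 1, 11, 30)  # newest recoverable data
recovered_time = datetime(2024, 1, 1, 15, 0)   # when systems came back

meets_rto = (recovered_time - disaster_time) <= RTO  # 3h downtime vs 4h allowed
meets_rpo = (disaster_time - last_backup) <= RPO     # 30min data loss vs 1h allowed

print(meets_rto, meets_rpo)  # True True
```

Note that the RPO is determined by your backup or replication schedule (how old the newest copy can be), while the RTO is determined by your recovery procedures (how fast you can restore).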

Importance of a Quality Disaster Recovery Plan

With a superior disaster recovery plan in place, your company will experience minimal losses in revenue and productivity should a disaster event occur. The budget you allocate to your plan must be based on your company’s Recovery Time Objective (RTO) and Recovery Point Objective (RPO) to ensure an optimal plan of action for minimizing the impacts of downtime and the risk of legal and financial issues.

To learn more about determining the risks associated with your IT infrastructure crashing and how to create a powerful disaster recovery plan, speak with a Zycom specialist today. Zycom offers premium services that enable your business to focus on key competencies, while our experts focus on ensuring a smooth transition of your data to the cloud.

Cloud Solutions

Are You Ready for Cloud?

Adapting to the digital age is imperative to the success of any business. “The cloud”, as the tech industry has named it, refers to software and services that run on the internet as opposed to locally on a computer. It’s advanced technology that improves IT maintenance, performance, scalability, and efficiency, while also reducing IT costs. However, switching your business over to the cloud isn’t as simple as signing into your web browser. First, there are some steps to take to prepare your IT infrastructure for this digital transformation.

Assessing Your Business

Before migrating to the cloud, you must ensure your IT infrastructure is properly prepared for the migration to avoid any delays or discrepancies during the migration process. This includes assessing your environment and business needs to determine what the best cloud solution provider is for your business.

There are private, public and hybrid cloud solutions, all of which offer unique benefits, usability and security. Private clouds are often the premium option for businesses as they offer superior security and data storage capabilities.

In terms of storage space, the cloud is highly recognized for its scalability, offering different options based on your specific needs. It all starts, though, with assessing your business and determining how you’ll be using the cloud.

Different Cloud Uses and Benefits

Migrating your business over to the cloud will grant you a plethora of advantages. Not only is the cloud economical and convenient, but it can also improve business operations, productivity, and security. Depending on the cloud solution provider you choose, you’ll also receive additional benefits.

Big Data Analytics

Collecting data is a key component of many businesses’ success. Companies such as Facebook and Amazon rely on collecting massive amounts of data to derive valuable information about their customers. You can do the same with the cloud, making it easier to make data-based decisions regarding crucial components of your success, such as marketing and sales.

Software-Defined Wide Area Networking (SD-WAN)

SD-WAN clouds integrate with your current network to enhance business operations and connectivity, while also addressing major issues you may experience without a cloud solution.

Virtual Desktops (VDI) and Desktop as a Service (DaaS)

VDI and DaaS cloud solutions are particularly beneficial for workplaces where employees bring their own devices, whether it’s a laptop, tablet or smartphone. These cloud solutions make it easy to standardize security and the type of content that can be accessed on each device.

Infrastructure as a Service (IaaS)

With IaaS, you can forgo major expenses associated with building and maintaining infrastructures. With this cloud use, you’re able to host data in a cloud solution provider’s data center.

Email

Something as simple as emails can be migrated over to the cloud, freeing up valuable space on your devices to increase speed, performance and flexibility. With an email cloud solution provider, important data can be shared with the entire team, sans an abundance of forwarded emails.

Disaster Recovery as a Service (DRaaS)

DRaaS is one of the core cloud uses, as disaster recovery is imperative to every business. Having the ability to quickly recover data and get your business back up and running can reduce downtime in the event of a disaster.

Backup as a Service (BaaS)

It’s recommended that every business invest in the 3-2-1 rule – keeping at least three copies of your data, on two types of media, with one copy offsite in a cloud solution. BaaS solutions provide a safe and secure backup of your data, enabling quick restoration in the event of a disaster.
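
The 3-2-1 rule can be expressed as a simple check – at least three copies, on at least two media types, with at least one copy offsite. The sketch below is purely illustrative; the media names are invented examples:

```python
def satisfies_3_2_1(copies):
    """copies: list of (media_type, is_offsite) tuples, one per backup copy.
    Returns True if the layout meets the 3-2-1 rule."""
    media_types = {media for media, _ in copies}
    offsite_count = sum(1 for _, offsite in copies if offsite)
    return len(copies) >= 3 and len(media_types) >= 2 and offsite_count >= 1

# Hypothetical layout: production disk, local tape, and an offsite cloud copy
backups = [("disk", False), ("tape", False), ("cloud", True)]
print(satisfies_3_2_1(backups))  # True

# Two copies on the same media, nothing offsite: fails the rule
print(satisfies_3_2_1([("disk", False), ("disk", False)]))  # False
```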

Private Cloud, Public Cloud or Hybrid Cloud

Cloud solution providers offer data hosting on three types of environments – a private cloud, a public cloud or a hybrid cloud. Understanding the differences between the three is imperative to finding the right cloud solution provider for your business.

As a quick reference, a public cloud solution provider makes data available to the public through the Internet, such as Google Docs. Private cloud solutions are run on a private internal network or internet, and hybrid cloud solutions are a combination of the two.

Software as a Service (SaaS)

SaaS technology is a powerful tool for storing, organizing and maintaining data. The automation tools and customer relationship management services that come with SaaS can improve your business’ efficiency.

Finding the Best Cloud Solution Provider

Determining which cloud solution provider is best for your business depends on a variety of factors. In addition to going over the different cloud uses mentioned previously, some things to consider include:

  • Security
  • Compliance
  • Manageability
  • Service Levels
  • Costs
  • Platform/mobile compatibility
  • Scalability
  • Container capabilities

Without proper preparation, migrating to the cloud could negatively affect performance and efficiency. Contact Zycom Technology today for more information on the process of cloud migration and how to utilize this advanced technology to further your business.

An IT Health Check Can Keep the Blue or Purple Screen of Death Away

As technology advances, it’s imperative that your digital infrastructure and business operations follow suit.  Implementing Hyper-Converged Infrastructure (HCI) is a realistic option for any organization interested in improving its IT datacenter while also digitalizing and preparing the business for future technology.

Preparing to Migrate

If you are considering moving to HCI, baselining your current infrastructure against best practices is key to maximizing the success of the migration to your new environment.  Understanding which components of your current software and driver stacks will move with your business workloads to HCI is important.  Are there opportunities to right-size or optimize existing applications, databases and operating environments to leverage HCI best practices?

You have HCI but are you supportable?

While HCI offers a simplified enterprise cloud infrastructure that collapses data center computing, storage, management and networking, simplicity can lead to complexity.  HCI solutions, such as Nutanix, automate many previously complex data center management tasks into 1-click operations.  For automation to flourish, however, there are dependencies between your commoditized hardware, drivers, hypervisor and software that must be taken into account.  In a landscape of automation, the finer details of compatibility and supportability can be easily overlooked.

If you are like most of our customers, you are busy running and supporting your business, and Murphy’s law did not allow you to keep up on everything HCI once it was installed.  Your HCI just works, but time has passed since you last updated your HCI firmware, software or hypervisor.  Are you still in a vendor-supported operational state?  Are there new features that will enhance your HCI solution?  Are you multiple versions behind and need to catch up?

Enter the HCI Health Check to get you baselined and back on track with a plan.

Contact Us

How a Healthy HCI Can Improve Infrastructure

Before upgrading your data center with HCI software, it’s crucial to execute a health check to make sure everything is aligned properly for success.  Which features will enhance your virtual workloads, and will there be any impact during the process?  Known issues may pop up from time to time that you may or may not have experienced, but it’s always better to be cautious.

Getting current with compatible versions will give you peace of mind before launching automated upgrades, knowing that your business’ most valuable data is safe.

Zycom HCI Health Check Services

Zycom offers HCI health checks that can help your organization prepare your HCI solution stack for upgrades. The service identifies any gaps and areas that need improvements and provides you with recommendations to ensure optimal performance.

Here are some things you’ll want to determine in the meantime before making any changes:

  • Which version of VDI broker is supported with your version of hypervisor?
  • Is your hypervisor supported by an HCI solution?
  • If so, which version of HCI solution is compatible with your hypervisor?
  • Which hardware firmware is supported for your type of HCI solution?
  • Which network drivers are supported for your HCI solution?

Zycom will perform a baseline review and health check service of the major components of the hyper-converged solution based on the industry’s best practices. This ensures your software is ready for HCI upgrades, maximizing performance and increasing agility in your business.

Contact Us

What is Hyper-Converged Infrastructure?

In order to understand the benefits of Hyper-Converged Infrastructure (HCI), you must first understand how it compares to traditional alternatives. In a standard data center, traffic is directed to a collection of servers that accommodate virtual machines. These servers need to access the storage area network through a storage controller before retrieving or storing data.

An HCI offers a more manageable infrastructure as it brings together separate elements into a unified whole – the hypervisor, server, and storage are combined into one single node. This removes the storage area network completely, and the infrastructure is controlled centrally by one piece of software.

In conclusion, traditional data centers are a series of spread-out servers, while an HCI is a software-defined infrastructure that brings everything together in unified nodes. HCI delivers and simplifies storage, computing, networking, management, virtualization, and data protection in one unified stack, offering processes and tools for an efficient and more manageable IT infrastructure.

Benefits of Upgrading to an HCI Infrastructure

When you upgrade to an HCI enterprise cloud solution, you get more out of your business operations for less. With the increased simplicity and reliability that come with an HCI infrastructure, combined with minimal recurring costs, hyper-converged systems provide a pivotal point for all types of businesses, along with additional benefits:

Increased Simplicity for Superior Business Operations

Utilizing an HCI data center can enhance agility throughout your business operations. An HCI infrastructure makes it easier to manage your data center as it provides you with the entire software stack in each node. The storage nodes act as one redundant pool of storage, and should one go down, the rest will be unaffected. This creates a highly reliable data center that offers immense simplicity, and with increased simplicity comes better automation, interconnection compatibility and dependency management.

Easier Collaborations and Migration of Data

With an HCI infrastructure, all of your workloads fall under the same managerial umbrella. This simplifies the process of transporting data to different locations, which is essential for digitalized businesses that rely on collaboration and teamwork.

Increased Reliability and Data Protection

HCI data centers also streamline the process of adding and removing nodes to correspond with your resource demands. Since HCIs are node-based infrastructures, this makes it easy to scale up your data center.

Additionally, hyper-converged systems simplify the process of restoring data, offering you enhanced data protection throughout your business.

Economic Solution for any IT Department

There is less equipment to purchase, support and maintain with an HCI data center, making it an affordable solution for any IT department. The recurring costs are minimal, yet the benefits are greater than those of a standard data center.

Contact Us

VMs versus Containers

The way you operate your computing systems is imperative to the overall efficiency, productivity and success of your business. With many major tech companies investing in containers, you’ve likely been wondering if VMs are still up to par. While containers are hugely popular, virtual machines remain a viable option. Both VMs and containers present their own unique benefits and drawbacks, however. In order to determine which option is best for your enterprise, you must first understand the pros and cons and the major differences between VMs and containers.

What are VMs?

A Virtual Machine (VM) is a simulation of a computing system. VMs make it possible to run what appear to be many different computers, each with its own applications, on a single piece of hardware. Despite running on one computer, each VM has its own operating system, and the underlying hardware is virtualized. This consumes a large amount of system resources, but is still economical compared to running individual physical computers.

Benefits of VMs

Companies that have undergone, or plan to execute, a digital transformation need to virtualize their servers in order to operate at full potential. From cloud computing services to virtualization technology, embracing VMs allows companies of all sizes to increase efficiency while lowering costs.

Additional benefits of embracing VMs in your IT infrastructure include:

  • Ability to establish management tools
  • All operating system resources are available to applications
  • Ability to establish security tools
  • Familiar security controls

What are Containers?

Containers are a growing trend within the computing industry. Instead of virtualizing the entire computing system, containers virtualize only the operating system. Each container shares the binaries, libraries and host OS kernel, all of which are read-only. This immediately reduces the amount of system resources used. In essence, you’re able to get more out of your operations by using less.

Benefits of Containers

Sharing operating resources, such as the ones mentioned previously, decreases the need to recreate the OS code. This allows for a server to run various workloads all with one single installation, resulting in time and money saved. Containers can also be used to create portable and consistent environments for development, testing, and deployment.

Since containers share the host OS, they are significantly lighter than VMs and take only a couple of seconds to boot up.

Additional benefits of embracing containers in your IT infrastructure include:

  • Reduced size
  • Decreased IT management resources
  • Faster start-up
  • Simplified security updates
  • Less coding is required to transfer, migrate and upload workloads

What are the Major Differences Between VMs and Containers?

VMs notoriously take up a large volume of system resources, which is one of the major differences between VMs and containers. Since each VM runs its own operating system, it quickly consumes a lot of RAM and CPU cycles. Containers, on the other hand, only virtualize the operating system, making them exceptionally light – often only megabytes in size. This also extends into the benefit of being able to start up in seconds, as opposed to the several minutes VMs can take.

Here’s a quick look at the main differences between the two:

  • Containers are lighter
  • VMs have limited performance
  • Each VM runs its own OS; containers share the host OS
  • VMs virtualize at a hardware-level; containers only virtualize the OS
  • Containers start up in seconds, compared to minutes with VMs
  • VMs take up a lot of resources; containers require less memory space
  • VMs are fully isolated due to the individualized OS

What’s the Better Choice?

VMs and containers present their own unique pros and cons to IT infrastructures.  VMs are an affordable option that may be the better choice if you’re running applications that require access to all of the operating system’s resources. They are also beneficial when you’re running several applications on servers or when you have a vast array of operating systems to manage.

Containers are typically the better and more popular choice, particularly for businesses that want to maximize the number of apps running on minimal servers.

Nutanix

With Nutanix, you can easily manage your virtualization environment by hosting Docker containers in VMs, bringing simplicity based on a “build, run, and deploy” principle. You can take advantage of container migration, streamline network and security configurations, and harness persistent storage for both local registries and containers.

Here’s a quick look at the benefits of Nutanix:

  • Single, easy-to-manage virtualization
  • Built for simplicity
  • Packages applications and their dependencies into flexible, highly portable virtual containers
  • Allows for container migration on VMs
  • Persistent storage for containers and local registries
  • Streamlined network
  • Security configuration

Nutanix supports many of the most popular container platforms, such as Docker containers on AHV, as well as VMs. It’s a versatile option that is ideal for any enterprise looking for rapid deployment and scale.

AI Robot Defense

AI, Your Silent Cyber Defender

AI (Artificial Intelligence) Cyber Defense or SIEM (Security Information and Event Management) or both?

Cybercriminals continue to come up with innovative ways to attack companies’ infrastructures, more so in the last year than ever before. Instead of hacking an infrastructure to plant a virus, cybercriminals have taken their tactics to the next level, executing cloud ransom attacks, as well as manipulation and exploitation tactics, in return for access to your own IT infrastructure. This raises the question of whether traditional cyber defense systems are equipped to detect and respond to such unique threats quickly enough to reduce the risk of damage. Artificial Intelligence Cyber Defense has the answer.

What is Artificial Intelligence (AI) Cyber Defense

Artificial Intelligence (AI) has transformed the digital world and the way in which businesses operate. With AI, enterprises are able to leverage business advantages, such as increased productivity as a result of faster operation times and automated efficiencies. However, the risk of cyber threats remains, as cybercriminals view digitization as an opportunity to hack digital infrastructures, such as the cloud, Internet of Things (IoT) devices, and software systems. Traditional security systems are not equipped to handle the increased and innovative attacks from cybercriminals.

How Artificial Intelligence Cyber Defense is Transforming How Companies Deal with Security Threats

Artificial Intelligence (AI) Cyber Defense, on the other hand, takes cybersecurity to the next level to protect enterprises from digital threats. From cloud ransom attacks to cybercrime exploitation, AI Cyber Defense systems offer optimal protection from past, present and future cybercriminal tactics. These systems use smarter automated security that detects threats and responds to them before they pose a risk to your infrastructure. AI Cyber Defense systems continue to learn on their own, using AI algorithms to identify outliers from typical patterns, which not only protects your enterprise from today’s threats but also from potential ones in the future. These systems detect threats that can’t be predicted by an IT individual, while also acting faster than any security practitioner to prevent costly damage.

Benefits of AI Cyber Defense Darktrace System

Darktrace is an innovative, intuitive and impressive AI cyber defense system that makes protecting your business’ most valuable data simple. The automated system uses mathematics and algorithms to see threats early, or as they’re happening, offering quick and efficient protection against threats that other security systems aren’t equipped to find.

What is Security Information and Event Management (SIEM)

Security Information and Event Management (SIEM) is a type of security management that combines Security Event Management (SEM) and Security Information Management (SIM) in one security management system. This type of software is designed to collect and combine data generated throughout an enterprise’s infrastructure, from the host systems to the network, applications and security devices, such as antivirus programs and firewalls. SIEM systems then use the information gathered to identify, categorize and analyze various events, produce reports, and send alerts on security-related situations.

While SIEM systems have been a strong force against cybercriminals in the past, they are not as equipped to deal with the new and advanced threats when compared to AI Cyber Defense Systems.

An AI Cyber Defense system can be used alongside a SIEM and will enhance its value. However, this type of defense is exceptionally powerful and well-equipped to handle all security threats on its own, and in less time than other security systems. AI Cyber Defense systems also allow companies with limited resources to do more with less; paying for two security systems is not needed for optimal protection.

Darktrace is a world-leading cyber-threat defense company that enhances the security surrounding your enterprise’s infrastructure. For more information, speak with one of our specialists today.

CDN Channel Innovation Awards: Zycom

Zycom showed up in full force to the inaugural CDN Channel Innovation Awards on September 19, 2018! The innovation awards turned out to be a great night and recognized some of the most innovative solutions across Canada. Zycom is excited to announce that we were recognized with a pair of awards commending our top projects of 2017. Our unique solutions ended up landing Zycom both a Diamond and a Gold Award.

Multiple achievements were awarded to Zycom:

Diamond for Best Education Solution

This award honors the solution provider with the most innovative education solution for the year.

Why we won:

Zycom was tasked with creating a cloud strategy for the Limestone District School Board in Kingston, specifically to update its student information system (SIS). Overall, the SIS lacked the data and disaster recovery capabilities that a system tracking 19,000 students should always have. The switch to a cloud-based solution, which houses the data of all students, reduced the school board’s IT costs by 70%, or roughly $70,000 per year.

Read more about this here.

Gold for Top (Federal/Provincial/Municipal) Government Solution

This award recognizes the most innovative and problem-solving hardware, networking, mobile, cloud, big data or software-defined solution for a federal, provincial or municipal government, or crown agency.

Why we won:

The City of Greater Sudbury, along with Nutanix and Zycom, came up with an innovative way to deploy a 3-node Dell XC Nutanix solution via a stretch cluster, made possible by the low latency of the fibre links and the SCADA application’s very low network traffic requirements. The solution placed a node at each of three plants, connected via fibre, with the SCADA application virtualized and now resilient between plants (i.e., replicated VMs with Nutanix software). The result is a viable and secure solution that can be easily backed up, eliminating downtime and dispatch. Should the CGS lose connectivity to a SCADA server that has remote wells and treatment centers reporting to it, the lost server instance will migrate to an alternate node at one of the other primary sites. The City’s networking and routing capabilities thus allow the affected remote reporting sites to continue to be monitored, and SCADA operational visibility is maintained for all outside wells and lift stations.

In addition to applying a software-defined solution to solve a monitoring uptime requirement for SCADA, the same technology was applied to a technology refresh of tier-3 infrastructure, enabling the IT Department at the City of Sudbury to save and return $600,000 in capital funding to the municipal government.

Read more about these solutions via our case study here.

The event was a great experience overall; nights like this are exactly what give us the confidence to continue to innovate and disrupt the industry.

Read the official press release of the event.  

Zycom Celebrates 20 Years in the Industry! | The Revolutionary Changes That Shaped The Tech Industry Over the Past 20 Years

From the minimal use of personal computers to a fully digital era, the tech industry has experienced some significant changes over the past 20 years. Zycom was founded in 1998, and celebrated its 20th anniversary on November 1st, 2018. To commemorate the many digital transformations that occurred throughout the past two decades, we thought it appropriate to take a look at the revolutionary changes we have evolved through, which shaped the tech industry into what we all know and appreciate today.

1998 – 1999: The Beginning

The Zycom Launch

Despite technology being used for several years prior, 1998 truly marks the year when technology was brought into people’s homes. Compaq purchased Digital Equipment Corporation, which led to Tim Allen leaving the company and starting Zycom with Mike Lucas. VMware was also founded the same year, and IBM announced 170MB and 340MB Microdrives that fit on a one-inch platter.

The following year, Mellanox Technologies was founded and the first BlackBerry was released by RIM, with many still using BlackBerry devices today. The Intel Pentium III 500MHz was released, with the Intel Pentium III 600B MHz quickly following – cutting edge at the time. And most notably, the 802.11b WiFi standard was released by the IEEE, beginning the end of the atrocious sounds you’d hear if you called someone while they were using their dial-up internet.

2000: The Millennial Milestone

The Anticipated Y2K Crash

Inarguably, one of the most significant milestones of the past 20 years was the highly anticipated Y2K crash. As the year approached, many believed that computer programs would crash and all electronic devices would fail as the year shifted from 1999 to 2000. Despite all the excitement surrounding the potential Y2K apocalypse, technology made a smooth transition into the new year, sans any technical delays, failures or outages.

The Dot-Com Bubble Bursts

The millennial year also marked the bursting of the dot-com bubble, after years in which many Internet companies were launched. Investors had assumed that all online companies were going to be worth millions, but when many didn’t reach optimal success, the tech companies’ stocks crashed, resulting in many investors taking significant losses.

The ILOVEYOU Virus Takes Over

One of the worst computer viruses seen over the past 20 years was the ILOVEYOU computer worm that attacked millions of Windows computers. This virus had the ability to destroy data and resulted in approximately $10 billion worth of damages (CNet).

The Tech Industry Produces Revolutionary Upgrades

It wasn’t all bad news for the tech industry in 2000. Microsoft launched its revolutionary Windows 2000 software, and Intel and AMD broke processor speed records with their 1GHz chips. Super DLT tape was released with 110GB of capacity, LTO-1 launched with 100GB of capacity, and Seagate produced a 15,000 RPM HDD, introducing power never seen before.

2001 – 2005: From Business to Personal

The Beginning of Social Media

Throughout the five years that followed the millennium, technology shifted from a business necessity to an entertainment essential in every home. In 2002, LinkedIn registered its business, launching the following year and showing the world the potential of the Internet. It was a couple of years, though, before this business social media platform took off, as Myspace was founded the same year and took precedence on everyone’s computer screens. Come 2004, The Facebook – which we now know as Facebook – launched, creating a new way to communicate with people across the globe. Google’s Gmail followed suit with its debut, which resulted in many people trading in their Hotmail and Yahoo email accounts, and in 2005, YouTube launched its innovative platform that changed the face of entertainment.

The MyDoom Virus

As technology continued to advance, so did computer viruses. 2004 marks the year of the MyDoom virus, a highly damaging worm that spread through email like wildfire, creating a backdoor in infected computers’ operating systems.

The Tech Industry Continues to Improve

Anyone who had a computer in 2001 can relate to the excitement surrounding the launch of Windows XP that year, a progressive new platform that many still run today. But the advancements didn’t end there. In 2002, the SATA 1.0 interface was introduced, Dell became the largest PC maker, Hitachi bought IBM’s HDD business and HP completed its acquisition of Compaq.

Come 2004, IBM sold its personal computing division to Lenovo, and in 2005 the SAS interface was introduced and the first-ever 500GB HDD shipped, sparking a new series of changes within the tech industry.

2006 – 2010: The Era of Cutting Edge Technology

The Cloud is Coined

Social media launches continued into 2006 with Twitter, but it was cloud computing that captivated everyone’s attention after the term was popularized by Google’s CEO. Following suit, Dropbox was founded the following year, and in 2010 OpenStack established its open-source cloud platform services.

The Apple Trend

Today, Apple is a leading brand within the tech industry, and it all began in 2007 when Apple launched the first iPhone. A few years later, in 2010, Apple launched the iPad, which introduced the consumerization of IT and led to VDI taking off as a way to deliver secure applications on BYOD devices. The same year, Zycom launched its VDI practice to accommodate this new technology trend.

The Ups and Downs

Between 2006 and 2010, the tech industry experienced many ups and downs. In 2006, it was all positive: Amazon.com opened AWS, Intel Core 2 Duo processors launched and Seagate hit 750GB of HDD capacity. During the years that followed, technology was still on the rise, with Microsoft releasing its first hypervisor in 2008, Google releasing the Chrome web browser and VMware VDM becoming VMware View 3.

In 2009, Nutanix was founded, the ProBook was launched by HP and Microsoft released Windows 7. However, Nortel filed for bankruptcy protection, which sent tech stocks plummeting.

Heading into 2010, technology continued to improve, with Oracle completing its acquisition of Sun Microsystems.

2011 – 2015: The IT Transformation

The Details Count

Throughout 2011 to 2015, the IT industry began to transform into what the world uses today. 2011 marks the year of Chromebooks; in 2012 Dell completed its acquisition of Wyse, and in 2013 the CryptoLocker ransomware was discovered.

Come 2014, DDR4 RAM made its debut and the 8TB HDD shipped from Seagate, marking the beginning of a new era of storage.

The Zycom Transformation

Zycom launched our first cloud-based DRaaS in 2012. The following year, in 2013, Zycom began selling Nutanix for VDI use cases, which set the company on an innovative journey of IT transformation. Software Defined Storage became a buzz phrase throughout the tech industry, HCI became a dominant acronym, and Zycom began to pioneer HCI services in Canada, resulting in significant growth in IT transformation.

2015 – Today: The 20th Anniversary

The Advanced Gets Advanced

Just when you thought technology couldn’t become any more advanced, 2015 rolled around. During this year, Microsoft released Windows 10, Dell entered into an agreement to acquire EMC, HP split into HP Inc. and HPE, and Zycom sold the first hyper-converged Rubrik data recovery appliances in Canada.

In 2017, LTO-8 tape drives delivered a powerful upgrade with 12TB of native uncompressed capacity, SATA HDDs reached the 12 to 14TB capacity range, SSDs began shipping at 3.84TB capacity and Zycom grew 39% year over year.

The 20th Celebration

As of November 1st, 2018, Zycom formally celebrates a successful 20 years of operations, growth and digital transformation.

We continue to look to the future for emerging technology that will help change the face of Information Technology to help business evolve in a digital economy.

We would like to thank our valued customers and all of our staff and vendor partners who have helped us achieve this milestone in Zycom’s history.

The Digital Workforce

The future of the workforce is digital. With technology quickly evolving, a variety of automated and robotic solutions are becoming available to businesses interested in driving productivity efficiencies. However, a digital workforce does not mean a physical robot standing in for a worker. Instead, a digital workforce uses virtual software to enhance the experience staff and consumers have with your business.

The Significance of The Digital Workforce Today

CNBC recently reported that a study conducted by IWG found that approximately 70% of professionals work remotely at least one day a week, and 53% work remotely for at least half of the week.

This stems from changing attitudes toward work environments and whether the traditional nine-to-five working hours are the most beneficial to both staff members and the companies for which they work.

By utilizing a digital workforce that enables employees to work in the environment best suited to them, it’s believed that their view of the company is transformed, allowing them to be more productive.

In fact, a study reported by Forbes found that remote workers made 13.5% more calls than their coworkers who worked in the company’s office. Additionally, 91% of people who work from home feel that they’re more productive than when they are in the office. When Best Buy introduced a more flexible work program, they experienced a 35% increase in employee productivity, and when ConnectSolutions did the same, they found that 77% of remote workers were able to get more done in less time.

Components of a Digital Workforce

Transforming to a digital workforce isn’t as simple as sending your employees home with their computers. There are various components that make up a digital workforce, such as the following:

WiFi connected and ISP connected

Remote workers need reliable, secure, fast Internet to ensure productivity efficiencies. Thus, a key component of a digital workforce is having WiFi and ISP connections readily available for telecommuting employees.

Endpoints and newer desktop as a service (managed endpoints as a service)

Endpoints are the remote computing devices that communicate with a network, such as desktops, laptops, tablets, smartphones, servers or workstations used remotely. It’s imperative that any company undergoing a digital transformation implements the software and technology needed to manage these devices.

Next-gen endpoint security

The most vital component of a digital workforce is security. With employees working remotely, you must have the highest grade of security installed on all devices to protect your data while guarding against Internet threats and attacks.

Notebooks and tablets

Notebooks and tablets present a plethora of advantages within the workplace, such as access to crucial documents, manuals, and books. This allows remote workers to increase speed and agility anytime, anywhere.

For example, Alaska Airlines equipped all of its pilots with iPads that hold the various flight manuals needed. By doing so, their pilots are able to access valuable information quickly from any location. Additionally, it’s environmentally-friendly and a more secure way to share crucial documents in comparison to physical manuals.

Portable 2nd USB monitors

You’ll need to provide your remote workers with reliable displays to enable optimal productivity. However, the displays must be portable to allow for easy transportation between locations. Second USB monitors give remote workers a reliable display they can hook up to their computer without complication or frustration, as the monitors require nothing more than a USB port. This makes them power efficient and exceptionally portable.

Portable printers and scanners

Remote workers will need access to printers and scanners; supplying both machines to each remote worker is costly. However, with portable printers and scanners, remote workers are able to take the devices to their remote office when needed, as they’re compact enough to easily be transported to and from different locations.

Office productivity suites like Office 365

Office 365 is a web-based productivity tool designed to allow companies to capitalize on the power of collaboration. It helps remote workers collaborate on projects and communicate securely across any device. Put simply, it provides employees with vital business programs (Word, Excel, Outlook, SharePoint, OneDrive for Business, Skype for Business, etc.) that can be accessed securely, anytime and anywhere, with optimal reliability.

Collaboration spaces like Microsoft or Cisco Teams and SharePoint

Workers need to have collaboration spaces that allow them to work together, despite being in separate locations. Advanced technology has made collaboration fairly easy amongst some of the biggest platforms, such as Adobe and Microsoft programs. There are also various apps dedicated solely to bringing together a team of remote workers such as Cisco Teams and SharePoint.

This component of a digital workforce is, inarguably, one of the most crucial to your success, as these tools enable remote workers to participate in meetings, group messages, collaboration, file-sharing and more.

The digital workforce is the future of business. To learn more about digitally transforming your company, contact Zycom today.

Data Center Storage Futures

Data storage technology is taking a giant leap in its evolution, offering the most powerful capabilities the world has yet seen in compact form. Here are some highlights you need to know about the future of data storage technology and what it means for you.

Storage Density

Less is about to become more in terms of size, with mind-blowing speed, performance and storage capacity. The biggest change you can expect from data storage technology in the future is the physical size of the storage technologies. The industry is beginning to roll out astonishingly compact storage devices that will have you wondering how you ever survived with a hard drive, USB stick or floppy drive.

What’s so revolutionary about this new technology isn’t necessarily the size but rather, the power that can be found in such density.

HCI Footprint

The new, compact storage devices will play a significant role in the HCI footprint, allowing you to plug more into your integrated server via the new infrastructure. Previously, you had only a few options to increase storage: you could update an existing HCI node by swapping or adding an HDD or SSD, or you could add a storage-heavy node, which came with limited compute.

With the revolutionary new storage technologies, you can expect the highest density the industry has seen. Traditional Tier-3 storage architectures have already been consolidated at ratios of more than 16:1 for customers, and the future of storage is expected to double current capacity.

To put it into perspective, the Nutanix 2U block/chassis structure in 2015 offered 4 host servers with a maximum of 16.6TB of raw capacity per 2U footprint in the data center. Today, the same 4-node Nutanix 2U block offers 55.6TB of raw capacity per 2U footprint. That is more than triple the storage density, a roughly 235% increase.

In the future, if HCI technology begins to incorporate the anticipated 20TB HDDs for enterprise use, the Nutanix 2U structure is expected to at least double its storage density again, offering 111.2TB of raw capacity per 2U footprint in the data center. We say, bring it on!
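These density figures can be sanity-checked with simple arithmetic. Here is a minimal Python sketch using the per-2U raw capacities quoted above:

```python
# Storage-density arithmetic using the capacities quoted in this article.
# The 111.2TB figure is the article's forecast for 20TB enterprise HDDs,
# not a shipping configuration.

def pct_increase(old_tb: float, new_tb: float) -> float:
    """Percent increase in raw capacity per 2U footprint."""
    return (new_tb - old_tb) / old_tb * 100

capacity_2015 = 16.6     # TB raw per 2U, 4-node Nutanix block (2015)
capacity_today = 55.6    # TB raw per 2U, same 4-node block today
capacity_future = 111.2  # TB raw per 2U, if 20TB enterprise HDDs arrive

print(f"2015 to today:  {pct_increase(capacity_2015, capacity_today):.0f}% increase")
print(f"today to future: {pct_increase(capacity_today, capacity_future):.0f}% increase")
```

Running the numbers this way shows the 2015-to-today jump is roughly a 235% increase (more than triple), and the projected 111.2TB figure is a further doubling.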

PMR vs HAMR

HAMR is predicted to follow suit with the new data storage changes, bringing hard drives that use Heat-Assisted Magnetic Recording (HAMR) to the forefront. This is a significant change from Perpendicular Magnetic Recording (PMR), the method used almost exclusively prior to the introduction of HAMR.

Advantages of HAMR over PMR

HAMR offers immense storage capacity and storage density that simply isn’t available in PMR form, so much so that manufacturers are on track to provide 20TB drives by 2019, with a predicted 30% compound annual growth rate in data density moving forward. By 2023, you can expect to see 40TB or higher. To put it into perspective, that’s an improvement of up to 10X over PMR drives.
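That projection is straightforward compound-growth arithmetic. A minimal Python sketch, assuming the figures above (20TB in 2019 and a 30% compound annual growth rate):

```python
# Compound-growth sketch using the article's figures: 20TB HAMR drives in
# 2019 and a predicted ~30% compound annual growth rate in data density.

def projected_capacity_tb(base_tb: float, cagr: float, years: int) -> float:
    """Capacity after `years` of compound growth at rate `cagr`."""
    return base_tb * (1 + cagr) ** years

for year in range(2019, 2024):
    tb = projected_capacity_tb(20, 0.30, year - 2019)
    print(f"{year}: {tb:.1f} TB")
```

At that rate, the projection passes 57TB by 2023, comfortably above the 40TB mark the industry predicts.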

Additionally, HAMR drives offer more reliability thanks to the glass media used in their manufacturing. The recording spot on a HAMR drive is heated and cooled in a nanosecond, and although a laser diode is used during the writing process, it has no impact on the drive’s temperature, stability or reliability. As a result, HAMR offers a level of dependability the industry has yet to see.

From floppy disks to zip drives, USB sticks and hard drives, data storage technologies have come a long way in a remarkably short period, and it’s not about to end. 2018 marks the beginning of a revolutionary technology that will inevitably shape the future and the way businesses operate. It will be interesting to see how the physics of RAID algorithms fare compared to distributed file systems, which offer faster rebuild times. Whilst Zycom is eager to deploy and experiment with these forecasted larger data drives for HCI in particular, we do so with extreme curiosity about rebuild times in RAID versus DFS.

vSphere v5.5 End of Support September 19th

As the industry prepares to upgrade from vSphere V5.5 to V6, it’s time to ensure your cloud computing virtualization platform follows suit. General support for V5.5 is scheduled to end as of September 19, 2018, and to continue leveraging the power of virtualization, you’ll need to upgrade to V6.

Benefits of Upgrading to V6

While there are some other alternatives to upgrading to V6, it’s strongly recommended to continue with the vSphere platform. Aside from maintaining the same level of quality and support you require, upgrading to V6 provides you with new enhancements and features, such as:

  • vCenter Migration Tool: a built-in migration tool to assist with moving to the new version
  • vCenter High Availability: the ability to create clusters of vCenter appliances
  • Updated Web Client: an improved HTML5 web client
  • Encryption: the ability to encrypt VMs at the hypervisor level and on a per-VM basis
  • HA and DRS: improvements such as network-aware DRS load balancing and Proactive HA
  • Storage: increased LUN limits, NFS v4.1, advanced format drives and automated UNMAP

Preparing to Migrate Off of V5.5

Migrating off of vSphere V5.5 is a complex task that comes with many prerequisites. It’s recommended to take inventory of the following systems and software to get the process started:

  • Any VMware solutions associated with your environment
  • Any other third party solutions associated with your environment
  • Any databases used with vCenter Server, vSphere Update Manager or any other VMware solutions associated with your environment
  • The database’s current compatibility level (and if it’s supported with V6)

Migrating off of V5.5

The next step is to determine which versions of VMware solutions are compatible with vCenter Server 6.0 and to then follow a specific sequence to upgrade your IT infrastructure properly. This can be a complicated task without extensive vSphere and IT knowledge.
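The inventory-and-compatibility step above can be sketched as a simple pass over your environment. The product names and version numbers below are hypothetical placeholders, not actual VMware interoperability data; always confirm against VMware’s Product Interoperability Matrix before upgrading:

```python
# Hypothetical pre-migration checklist sketch. The products and versions
# listed here are illustrative placeholders only; real compatibility data
# must come from VMware's Product Interoperability Matrix.

inventory = {
    # product: (installed_version, minimum_version_supported_with_vCenter_6)
    "vSphere Update Manager": ("5.5", "6.0"),
    "Example Backup Agent": ("8.0", "9.5"),      # hypothetical third-party tool
    "Example Monitoring Tool": ("3.2", "3.0"),   # hypothetical third-party tool
}

def needs_upgrade(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically (e.g. '5.5' < '6.0')."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) < as_tuple(required)

for product, (installed, required) in inventory.items():
    if needs_upgrade(installed, required):
        print(f"Upgrade {product} ({installed} -> {required}) before migrating")
    else:
        print(f"{product} {installed} is ready")
```

A pass like this makes the upgrade sequence explicit: every dependent solution flagged for upgrade must be brought to a compatible version before vCenter Server itself is migrated.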

 

Support for vSphere V5.5 ends on September 19th. Contact Zycom today for more information and/or for assistance migrating off of V5.5.
