Tag Archive for Cloud computing

The Future for Avaya is Cloudy

Back in 2017, former telephony giant Avaya (AVYA) declared bankruptcy. Since then there have been a number of attempts to break up the firm. Extreme Networks bought the Avaya networking division in 2017, and in 2019 there were rumors that Mitel was going to attempt a leveraged buyout of Avaya.

Eventually, Avaya made a deal with Unified Communications as a Service (UCaaS) vendor RingCentral (RNG) to save its bacon. Under the deal, RingCentral will pay Avaya $500M and will be Avaya’s exclusive provider of UCaaS solutions. The two firms announced the “strategic partnership” in October 2019.

It’s February 2020 and the Avaya-RingCentral collaboration will start to show some results next quarter. The beleaguered vendor announced at its Avaya Engage love-fest that beginning March 31, the unimaginatively named Avaya Cloud Office by RingCentral (ACO) will be identical in features in the U.S. to the product RingCentral sells today. The rest of the world will have to wait, because RingCentral UCaaS is only available in seven countries.

It is reported that a few additional Avaya features will creep into the offering through 2020. The first two, targeted for release this summer, are bridged appearance and call park and page. Bridged appearance lets two desk phones maintain separate and shared lines, a feature typically used between assistants and their bosses. With call park and page, when a person places a call on hold, the system will automatically send a page to another department or user to pick up the call. The feature is particularly useful to retailers.

Towards the end of 2020 or later, the vendor expects to deliver features that include line appearance, call appearance, hotdesking, and support for the venerable Avaya Audix voicemail service.

Initially, Avaya Cloud Office by RingCentral will only work with three models of Avaya’s J series desk phones: 139, 169, and 179. Avaya will work with RingCentral to certify B series conference room phones, L series headsets and the CU360 video conferencing system. However, most IP Office customers are likely using older devices, given that Avaya launched the J series only one year ago.

Avaya is also developing software to automate the process of migrating settings and users from its legacy gear to the cloud, although that tool won’t be available until later in 2020.

rb-

No Jitter points out that Avaya needed to do something: it faced the threat that its large installed base, which goes back to legacy Nortel platforms, would dump Avaya.

To me this looks more like a win for RingCentral. For a relatively small investment ($500M on a market capitalization of $10.5B), RingCentral becomes the preferred UCaaS provider for the large Avaya installed base (100M+ seats) likely planning a move to the cloud. Meanwhile, Avaya picks up a fully developed UCaaS to sell, if it can execute, which has been its problem all along.

Can Avaya hold on long enough to develop the promised automation tools and move complicated things like CMS to a cloud interface? We will see.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Black Hole Data

The first image of a black hole was published on April 10, 2019. The black hole, M87*, at the center of the Messier 87 galaxy, is located 53 million light-years from Earth. NASA says a black hole is an extremely dense object from which no light can escape. Anything that comes within a black hole’s “event horizon” will be consumed because of the black hole’s unimaginably strong gravity.

the first image of a black hole

By its very nature, a black hole cannot be seen; the bright ring in the picture marks the event horizon, the point beyond which an object approaching a black hole is unable to escape its gravitational pull. Objects that pass into the event horizon go through spaghettification, a process, first described by Stephen Hawking, in which gravitational forces stretch the object out like a piece of pasta. The M87* image shows a silhouette of the black hole against the glow of the event horizon, captured by researchers at the Event Horizon Telescope (EHT).

APEX Atacama Pathfinder Experiment antenna.

The EHT is the brainchild of Shep Doeleman, the director of EHT and an astronomer at the Harvard-Smithsonian Center for Astrophysics. It is a virtual global array of eight ground-based radio telescopes. The EHT captured around 3.5 PB of data for the black hole image in April 2017; it then took two years to correlate the data to form the image. The EHT team not only had to figure out intergalactic science but also massive information technology problems. The researchers had to solve IT problems pretty typical for enterprise IT professionals, only bigger.

According to an article at SearchDataBackup, each EHT telescope can record data at a rate of 64 Gbps, and each observation period can last more than 10 hours. The author calculated that each site generated around half a petabyte of data per run. The distributed locations included volcanoes in Hawaii and Mexico, mountains in Arizona and the Spanish Sierra Nevada, the Chilean Atacama Desert, and Antarctica. The sites were kept in sync using precise atomic clocks and GPS systems to carefully time the observations.

The data from each telescope was recorded at 16 Gbps per recorder unit and distributed among a total of 32 hard disk drives grouped into 4 modules of 8 disks each. By running the 4 units in tandem, each site could record at a total rate of 64 Gbps.
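Those recorder numbers hang together. A quick back-of-the-envelope sketch (the constant and function names below are mine, not the project’s):

```python
# Sanity check of the EHT per-site recording figures from the article.
TOTAL_RATE_GBPS = 64              # aggregate recording rate per site
MODULES = 4                       # recorder units running in tandem
DISKS_PER_MODULE = 8              # 4 x 8 = 32 drives per site

per_module_gbps = TOTAL_RATE_GBPS / MODULES          # 16 Gbps per unit
per_disk_gbps = per_module_gbps / DISKS_PER_MODULE   # 2 Gbps per drive

def petabytes_per_run(hours):
    """Data volume of one observing run at the full 64 Gbps rate."""
    bits = TOTAL_RATE_GBPS * 1e9 * hours * 3600
    return bits / 8 / 1e15        # bits -> bytes -> petabytes

print(per_module_gbps, per_disk_gbps)    # 16.0 2.0
print(petabytes_per_run(10))             # 0.288 PB in a 10-hour run
```

At a strict 10 hours that is about 0.29 PB; runs stretching well past 10 hours get a site to the “around half a petabyte” the author calculated.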

Sites making up the virtual Event Horizon Telescope.

 

One problem EHT ran into was the failure rate of traditional hard drives in the extreme telescope locations. ComputerWorld reports that 28 of 32 conventional hard drives failed at the Sierra Negra telescope, on the top of an extinct volcano in Mexico.

WD 10TB helium disk drive

SearchDataBackup says the solution was helium hard drives. The hermetically sealed helium drives are self-contained environments, so they could survive the extreme conditions in which EHT’s telescopes operated. EHT first deployed helium hard drives in 2015. EHT data scientist Lindy Blackburn told SearchDataBackup that EHT now uses about 1,000 helium drives with up to 10 TB of capacity from Western Digital, Seagate, and Toshiba. He told the site,

The move to helium-sealed drives was a major advancement for the EHT … Not only do they perform well at altitude and run cooler, but there have been very few failures over the years. For example, no drives failed during the EHT’s 2017 observing campaign.

The amount of data collected by EHT was too much to send over the Internet, so the researchers went old-school and used FedEx, sneakernet style, to send the data to be processed. Geoffrey Bower, an astronomer in Hawaii, told ScienceNews that mailing the disks is always a little nerve-wracking. So far, there have been no major shipping mishaps. But the cost and logistics involved with tracking and maintaining a multi-petabyte disk inventory are also challenging. Therefore, EHT is always on the lookout for another method to move petabyte-scale data.
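Never underestimate the bandwidth of a courier van full of disks. A rough comparison, using an assumed 1 Gbps uplink and a 24-hour shipment (both my illustrative numbers, not EHT’s):

```python
# Effective bandwidth of the EHT's "FedEx sneakernet" versus a network
# upload. The uplink speed and courier transit time are illustrative
# assumptions, not figures from the project.
SITE_DATA_PB = 0.5                # roughly one site's haul per run
UPLINK_GBPS = 1.0                 # assumed internet uplink at a remote site
SHIP_HOURS = 24.0                 # assumed overnight courier transit

data_bits = SITE_DATA_PB * 1e15 * 8

upload_days = data_bits / (UPLINK_GBPS * 1e9) / 86400
sneakernet_gbps = data_bits / (SHIP_HOURS * 3600) / 1e9

print(round(upload_days, 1))      # ~46.3 days over the wire
print(round(sneakernet_gbps, 1))  # ~46.3 Gbps by overnight courier
```

Under those assumptions, shipping the drives delivers the data roughly 46 times faster than the wire.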

Cloud computing


SearchDataBackup points out that the cloud would normally be a good option for long-term storage of data sourced from multiple, globally distributed endpoints. However, Mr. Blackburn told them the cloud was not a cold storage option for the project. He said the high recording speed and the sheer volume of data captured made it impractical to upload to a cloud. He explained, “At the moment, parallel recording to massive banks of hard drives, then physically shipping those drives somewhere is still the most practical solution.”

The data collected on the helium hard disk drive packs was processed by a grid computer made of about 800 CPUs, all connected through a 40 Gbps network, at the MIT Haystack Observatory in Massachusetts and the Max Planck Institute for Radio Astronomy in Germany.

Katie Bouman, the MIT student who developed an algorithm that pieced together the data from the EHT disk drives.

Geoff Crew, co-leader of the EHT correlation working group at Haystack Observatory, told SearchDataBackup that it is impractical to use the cloud for computing. Mr. Crew said:

Cloud computing does not make sense today, as the volume of data would be prohibitively expensive to load into the cloud and, once there, might not be physically placed to be efficiently computed.

The EHT scientists built algorithms that converted sparse data into images. They developed a way to cut the number of possible images by sorting out which results were physically plausible and which were wildly unlikely, making it easier to create the images.
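The article doesn’t detail those algorithms, but the core idea (many images fit sparse data, so a prior picks among them) can be sketched with a toy example. This is a generic ridge-regularized solve, not the EHT team’s actual method:

```python
# A toy version of the EHT imaging problem: recover an image from a
# sparse set of Fourier-domain samples by preferring "plausible"
# (here: small-norm) solutions. This is a generic ridge-regularized
# least-squares sketch, not the team's actual algorithm.
import numpy as np

rng = np.random.default_rng(0)
n = 16                                   # 1-D "image" with n pixels
true = np.zeros(n)
true[5], true[9] = 1.0, 0.5              # a sparse "sky"

F = np.fft.fft(np.eye(n))                # full DFT matrix
keep = rng.choice(n, size=6, replace=False)   # only 6 of 16 "baselines"
A = F[keep]                              # sparse sampling operator
y = A @ true                             # observed visibilities

# Among the many images consistent with the sparse data, pick the
# small-norm one via a ridge-regularized normal-equation solve.
lam = 1e-3
x = np.linalg.solve(A.conj().T @ A + lam * np.eye(n), A.conj().T @ y)

print(np.abs(x).round(2))                # an estimate of the 16 pixels
```

With only 6 of 16 samples the estimate is blurry rather than exact, which is why the real pipeline needed much stronger plausibility priors than a simple norm penalty.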

The Haystack VLBI Correlator grid computer at the MIT Haystack Observatory

Converting sparse data into images matters beyond astronomy. Mr. Blackburn told 538 that the problem comes up in other areas as well: it occurs in medical imaging, when doctors use MRIs to convert radio waves into pictures of your body, and it is also a key part of self-driving cars, which rely on computer vision to “see” everything from potholes to people.

Just like any enterprise, EHT had to find a workable method of data protection. That includes deciding what won’t be protected. EHT has not found a cost-effective way to replicate or protect the raw radio signal data from the telescope sites. However, once the data has been processed and reduced to tens of petabytes, it is backed up on-site on several different RAID systems and on Google Cloud Storage. Mr. Crew told SearchDataBackup:

The reduced data is archived and replicated to a number of internal EHT sites for the use of the team, and eventually, it will all be publicly archived. The raw data isn’t saved; we presently do not have any efficient and cost-effective means to back it up.

Mr. Blackburn said the raw data isn’t worth backing up: because of the complexity of protecting such a large amount of data, it would be simpler to run another observation and gather a new set of data. Mr. Blackburn said: “Backing up original raw data to preserve every bit is not so important.”

Mr. Blackburn said he can’t seriously consider implementing a backup process unless it is “sufficiently straightforward and economical.”

Instead, he said he’s looking at where technology might be in the next five or 10 years to find the best method to handle petabyte-scale raw data from the telescopes. Mr. Blackburn told SearchDataBackup:

Right now, it is not clear if that will be continuing to record to hard drives and using special-purpose correlation clusters, recording to hard drives and getting the data as quickly as possible to the cloud, or if SSD or even tape technology will progress to a point where they are competitive in both cost and speed to hard disks.

rb-

The image of the black hole validated Einstein’s general theory of relativity and proved that enterprise-class IT can solve intergalactic problems.

The EHT team had to figure out how to save, move, and back up massive quantities of data and, of course, do more with less. EHT’s Geoff Crew summed up the problem most IT pros have: “Most of our challenges are related to insufficient money, rather than technical hurdles.”

Related articles
  • Trolls hijacked a scientist’s image to attack Katie Bouman. They picked the wrong astrophysicist. (MSN)

 


Avaya LBO Buzz

Avaya is back in the news. Followers of the Bach Seat will recall that Avaya declared bankruptcy in 2017. Now the buzz is that the Santa Clara, California-based telecommunications equipment and software firm is considering a leveraged buyout offer.

Reports are circulating that Avaya’s (AVYA) board of directors is evaluating an offer from an unnamed private equity firm. Reportedly the offer values the Lucent spinoff at more than $20 per share, people in the know told Reuters. The private equity firm values Avaya at more than $5 billion, including $3.2 billion in debt.

Avaya is one of the world’s largest providers of telephony systems. It was spun off in 2000 from Lucent Technologies Inc., which itself used to be part of AT&T (T). The LBO talk comes 15 months after Avaya emerged from bankruptcy protection with an $8.3 billion debt legacy from a previous leveraged buyout by private equity firms TPG Capital and Silver Lake in 2007.

Avaya has tried to shift its revenue model to focus on cloud-based communications solutions with recurring software and subscription fees rather than its traditional hardware business, which is becoming more commoditized and dated. Much of Avaya’s new focus involves cloud services like unified communications as a service (UCaaS) and Contact Center as a Service (CCaaS). A new Device as a Service (DaaS) offering has also surfaced.

Avaya’s contact center business has also attracted acquisition interest in the past from private equity firms, including Clayton, Dubilier & Rice LLC, Hellman & Friedman LLC, and Permira Advisers LLP. Hellman & Friedman and Permira own Genesys, an Avaya competitor.

As of September 2018, Avaya had about 8,100 employees worldwide, including 2,800 in the U.S.

Private equity firms have recently focused on communications businesses. Among those companies are Aspect Software, Mitel, and PGi, each privately held by such firms. Note, too, that Polycom had been a Siris Capital property until its recent acquisition by Plantronics.



Holy Sell Out Batman

The Caped Crusader has sold out. While the full benefits of next-generation wireless (5G) won’t be realized until at least mid-2020, Batman is being used to pimp 5G. AT&T used the guardian of Gotham to create demand for mixed reality at last month’s Mobile World Congress in Barcelona, Spain. The mixed-reality experience featured DC Comics’ Batman and the Scarecrow battling it out on the MWC show floor.

Fierce Video reports that AT&T (T), Ericsson (ERIC), Intel (INTC), and Warner Bros., with DC, are using 5G technology and edge computing to build a location-based mixed-reality experience. For the walk-in experience at MWC, visitors put on an augmented reality headset and witnessed a two- to three-minute experience.

Ade Kushimo, director of business development, IoT, and emerging business at Ericsson, told Fierce Video, “The really cool part of the experience is going to be the fact that you have this virtual, digital content being embedded into your physical space. That gives you that mixed reality experience.”

Sensorama (patented 1962) was an arcade-style theatre cabinet that stimulated all the senses, not just sight and sound.

Mixed reality experience with Batman

Doug Matheson, vice president of strategic business development at Ericsson, said the proof-of-concept experience demonstrated that 5G technology (both radio and core) could be combined with intellectual property to create a mixed reality experience that’s both mobile and untethered.

In order to create a good mixed-reality experience, image lag has to be kept to a minimum. Image lag will make you dizzy and ruin the experience. That means that compute power has to be pushed out to the edge of the network to reside closer to the end-user. The compute power needed to process a mixed reality experience can’t live in a centralized data center somewhere.

The cloud-and-edge network architecture allows heavy computing to be done away from the device. So, the goal is to shift processing to the cloud and transport it there over a 5G network. The Batman demo ran on a fully integrated 5G network built with Ericsson radio base stations and enabled by Intel Xeon processors and the Intel 5G mobile trial platform; the 5G network technology helped supply the lower latency and higher speeds the experience requires.

5G – What is it

rb-

Mobile Marketer says that 5G will have a huge impact on AT&T’s mobile network. Its data traffic has grown more than 470,000% since 2007, with video making up half of the mobile data. Video may expand its share of data traffic to more than 75% by 2022, according to the company’s estimates.
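For scale, that growth figure implies traffic roughly doubled every year. A quick calculation, assuming the 470,000% covers 2007 through 2019 (the article doesn’t give the end year):

```python
# What "more than 470,000% growth since 2007" implies per year,
# assuming the figure covers 2007 through 2019 (the article does not
# give the end year).
growth_factor = 1 + 470_000 / 100     # 470,000% growth = a 4,701x increase
years = 2019 - 2007

cagr = growth_factor ** (1 / years) - 1
print(f"{cagr:.0%}")                  # prints 102% -- doubling roughly yearly
```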

Batman now works for AT&T following its acquisition of Time Warner, which owned Warner Brothers, which owned DC Comics, the home of Batman, Superman, Wonder Woman, Harley Quinn, the Joker, Lex Luthor, Oswald Cobblepot, and the Flash.

 


VMWare Crossing the Streams

The Ghostbusters warned us. Egon Spengler (Harold Ramis) warned us not to cross the streams, because as Raymond Stantz (Dan Aykroyd) explained, it would cause total protonic reversal. Despite the warning, VMware is crossing the streams.

Rumors have it that Dell/EMC/VMware and Microsoft (MSFT) are crossing their streams: a VMware Cloud NSX on Microsoft Azure partnership could be coming soon.

VMware’s (VMW) multi-cloud approach combines the core VMware technology stack with services delivered through partnerships with other service providers, including Amazon’s (AMZN) Amazon Web Services (AWS), Google Cloud, and IBM Cloud, as well as an emerging development environment centered on the open-source Kubernetes container orchestrator. ChannelE2E hypothesizes,

The two companies are jointly developing software that will let their customers more easily run computing jobs, which rely on VMware software, inside Microsoft’s Azure cloud computing service … could be announced … in the coming weeks … move computing chores from their own private data centers, where VMware’s software is a critical ingredient, to Microsoft’s “public” cloud service.

In the past, VMware CEO Pat Gelsinger described a range of cross-platform work, including:

  • Azure: NSX and VDI, with more VMware management products for Azure on the way.
  • Google Cloud Platform: VMware has partnered with Google and Kubernetes. Also, Android- and Chromebook-related offerings.

As the slide below shows, the deal with Microsoft links VMware to most of the enterprise VMs in the cloud. What impact will the VMware-Microsoft deal have on the VMware-AWS relationship? Will AWS continue to enjoy “most favored nation” status in VMware’s public cloud partner ecosystem?

The number of virtual machines in the cloud, based on RightScale enterprise estimates

The Redmond Channel Partner points out that former VMware executive Ray Blanchard, who was in charge of the VMware partnership with AWS, joined Microsoft a year ago.
