Tag Archive for Microsoft

Gartner’s Top Tech Trends For 2012

GartGartner's Top Tech Trends For 2012ner VP David Cearley described their top ten strategic technology trends for 2012 to attendees of the Gartner Symposium IT/Expo. Gartner (IT) defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Here are Gartner’s top strategic technologies for 2012.

Media tablets and beyond: Bring-your-own-technology (BYOT) at work has become the norm, not the exception. By 2015, tablet shipments will reach around 50% of laptop shipments, and Windows 8 will likely be in third place behind Google‘s (GOOG) Android and Apple’s (AAPL) iOS. The net result is that Microsoft‘s (MSFT) share of the client platform, be it PC, tablet, or smartphone, will likely be reduced to 60%, and it could fall below 50%, Mr. Cearley says.

The implication for IT is that the era of PC dominance, with Windows as the single platform, will be replaced by a post-PC era where Windows is one of a variety of environments IT will need to support. Gartner expects iOS and Android to dominate the market, with 80% of tablets shipped by 2015. IT leaders need a managed diversity program to address multiple form factors, as well as employees bringing their own smartphones and tablets into the workplace.

Mobile-Centric Applications and Interfaces: User interfaces built on windows, icons, menus, and pointers, in place for more than 20 years, are changing. They will be replaced by mobile-centric interfaces emphasizing touch, gesture, search, voice, and video. Applications themselves are likely to shift toward more focused, simple apps that can be assembled into more complex solutions. By 2015, half the applications that would have been written as native apps in 2011 will instead be delivered as Web apps.

Internet of Things: The Internet of Things (IoT) describes pervasive computing where cameras, sensors, microphones, and image recognition are all part of the environment. In addition, increasingly intelligent devices create issues such as privacy concerns, Gartner says. Drivers of the IoT are:

  • Near Field Communication (NFC) payments, which allow users to make a payment by waving their mobile phone in front of a compatible reader.
  • Embedded sensors which detect and communicate changes are being built into an increasing number of places and objects.
  • Image recognition technologies, which identify objects, people, buildings, places, logos, etc., that have value to consumers and enterprises.

App Stores and Marketplaces: Application stores run by Apple and Google provide marketplaces where hundreds of thousands of applications are available to mobile users. Gartner forecasts that by 2014 there will be more than 70 billion mobile application downloads from app stores every year, with a growing enterprise focus. With enterprise app stores, the role of IT shifts from centralized planner to market manager, providing governance and brokerage services to users and potentially an ecosystem to support entrepreneurs. Enterprises should use a managed diversity approach, focusing app store efforts and segmenting apps by risk and value.

Big Data: The size, complexity of formats, and speed of delivery exceed the capabilities of traditional data management technologies; Gartner says new technologies are required simply to manage the volume alone. One major implication of big data is that in the future users will not be able to put all useful information into a single data warehouse. Logical data warehouses, bringing together information from multiple sources as needed, will replace the single-data-warehouse model.

Cloud Computing: This remains an important trend. It will become the next-generation battleground for the likes of Google and Amazon (AMZN). Going forward, enterprise IT will be focused on developing hybrid private/public cloud apps and improving security and governance, Mr. Cearley says. While the market remains in its early stages in 2011 and 2012, it will see the full range of large enterprise providers fully engaged in delivering offerings to build cloud environments and deliver cloud services. Oracle (ORCL), IBM (IBM), and SAP (SAP) all have major initiatives to deliver a broader range of cloud services over the next two years. As Microsoft and these traditional enterprise players expand their cloud offerings, users will see competition heat up and enterprise-level cloud services increase.

Enterprises are moving from trying to understand the cloud to making decisions about which workloads to implement on cloud services and where they need to build out private clouds. Hybrid cloud computing, which brings together external public cloud services and internal private cloud services, as well as the capabilities to secure, manage, and govern the entire cloud spectrum, will be a major focus for 2012. From a security perspective, new certification programs will be ready for initial trials, setting the stage for more secure cloud computing. On the private cloud front, IT will be challenged to bring operations and development groups closer together using “DevOps” concepts in order to approach the speed and efficiencies of public cloud service providers.

Other key predictions from Gartner included:

  • Contextual and Social User Experience: Context-aware computing uses information about an end-user to improve the quality of interaction, anticipating the user’s needs and proactively serving up customized content. By 2015, 40% of the world’s smartphone users will opt in to context service providers, with Google, Microsoft, Nokia (NOK), and Apple continuously tracking their daily activities, Mr. Cearley says.
  • The growing use of flash memory for In-Memory Computing is a long-term technology trend that could have a disruptive impact comparable to that of cloud computing.
  • The adoption of Extreme Low-Energy Servers built on low-power processors typically used in mobile devices will increase for non-compute-intensive workloads or delivery of static objects to a website. Gartner says that 10%-15% of enterprise workloads are a good fit for this.
  • Next-Generation Analytics: Gartner says that over the next three years, analytics will mature from individuals analyzing simple, structured data to the analysis of complex information of many types (text, video, etc.) from many systems.
Related articles
  • Expecting a recession, Gartner urges ‘creative destruction’ (networkworld.com)

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

How Does Malware Spread?

The ZDNet Zero Day blog reports that Microsoft’s (MSFT) recently released Security Intelligence Report identified socially engineered malware (scareware pop-ups and blackhat search engine optimization attacks that entice users into downloading and executing a malicious file) as the most common malware propagation tactic.

Based on a sample of 600 million systems worldwide, Microsoft’s research ranks AutoRun USB infection as the second most common malware propagation tactic, according to Zero Day. Microsoft disabled AutoRun by default on Windows XP and Vista in February to prevent malware infections. The results, at least according to Microsoft, indicate a significant decline in malware using AutoRun as a spreading mechanism.

The report also points out that zero-day flaws do not necessarily represent a driving force in the growth of malicious attacks or cybercrime in general, according to the ZDNet blog. The full propagation breakdown:

  • User Interaction required – 44.8%
  • AutoRun: USB – 26%
  • AutoRun: Network – 17.2%
  • File Infector – 4.4%
  • Exploit: Update Long Available – 3.2%
  • Exploit: Update Available – 2.4%
  • Password Brute Force – 1.4%
  • Office Macros – 0.3%
  • Exploit: Zero Day – 0%
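Grouped, Microsoft’s numbers make the pattern plainer. A quick Python sketch (percentages transcribed from the list above; the category grouping is my own, for illustration):

```python
# Microsoft's propagation figures, regrouped into broad categories.
tactics = {
    "User Interaction Required": 44.8,
    "AutoRun: USB": 26.0,
    "AutoRun: Network": 17.2,
    "File Infector": 4.4,
    "Exploit: Update Long Available": 3.2,
    "Exploit: Update Available": 2.4,
    "Password Brute Force": 1.4,
    "Office Macros": 0.3,
    "Exploit: Zero Day": 0.0,
}

groups = {"Social engineering": 0.0, "AutoRun": 0.0, "Exploits": 0.0, "Other": 0.0}
for name, pct in tactics.items():
    if name.startswith("AutoRun"):
        groups["AutoRun"] += pct
    elif name.startswith("Exploit"):
        groups["Exploits"] += pct
    elif name == "User Interaction Required":
        groups["Social engineering"] += pct
    else:
        groups["Other"] += pct

for group, pct in sorted(groups.items(), key=lambda kv: -kv[1]):
    print(f"{group:>20}: {pct:.1f}%")
```

Social engineering plus AutoRun account for nearly 90% of observed infections, while every exploit class combined comes to under 6%, which is Microsoft’s core point.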

Zero Day points out that Microsoft is missing malware that spreads without user interaction, namely through the exploitation of client-side vulnerabilities in third-party software and browser plugins. The MSFT report says attackers regularly exploit client-side Java: Java exploits were responsible for between one-third and one-half of all exploits observed in the four most recent quarters.

rb-

I have written about the problems with old versions of Java and about JavaRa, which can delete all the old, unnecessary files Java leaves on your hard drive every time Sun Oracle plugs some more holes in its app.



Cloud Computing Risks

Cloud computing is a term even non-IT folks will have heard at least once by now, fueled by the concepts of Software-as-a-Service (SaaS) and virtualization. The idea is that IT services and processing capabilities can be more efficiently housed in a data center and delivered over the Internet on demand.

Dr. Dobb’s editor-in-chief Andrew Binstock told FierceCIO that the primary advantage of relying on cloud providers is that their combined expertise on the security and reliability front is in all likelihood better than that of most SMBs and even some larger IT shops.

Bob Violino at Internet Evolution writes that cloud computing offers some clear benefits for organizations: lower costs, automated software updates, greater flexibility, and the ability for IT staff to focus on more strategic projects and not day-to-day maintenance tasks.

It’s easy to get caught up in the cloud excitement, with major IT vendors such as Amazon (AMZN), Apple (AAPL), Dell (DELL), Google (GOOG), HP (HPQ), IBM (IBM), and Microsoft (MSFT) pushing the concept and rolling out cloud offerings. But organizations looking into cloud computing need to consider some key risks as well.

Larry Ellison, the chief executive of Oracle, told shareholders in 2008 that cloud technology is a fad that lacks a clear business model. “I think it’s ludicrous that cloud computing is taking over the world,” Ellison said. “It’s the Webvan of computing.”

Richard Stallman, the founder of the Free Software Foundation, sees cloud computing as a trap that will result in people being forced to buy into locked, proprietary systems that will only cost more over time. He told The Guardian: “It’s stupidity. It’s worse than stupidity: it’s a marketing hype campaign.”

Some of the cloud risks are well documented, but as the push for cloud services continues, a few risk points are starting to come into focus:

Data Privacy. In the U.S., the Fourth Amendment states that people should “be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures…” But web-hosted applications and cloud services are too new for the courts to have offered far-reaching guidance on data privacy online. Data stored outside the country makes privacy issues even more complex.

Information security. A report from the World Privacy Forum discusses the issues related to cloud computing and the privacy and confidentiality of information. According to the report, “for some information and for some business users, sharing may be illegal, may be limited in some ways, or may affect the status or protections of the information shared.”

Even when no laws prevent a user from disclosing information to a cloud provider, the report says, disclosure may still not be free of consequences. “Information stored by a business or an individual with a third-party may have fewer or weaker privacy or other protections than information in the possession of the creator of the information.” A cloud provider’s terms of service, privacy policy, and location may significantly affect a user’s privacy and confidentiality interests, the report states.

Data Security. There are many threats to data online: the application or service provider could go belly up, hackers could attack, or you could simply be locked out of your account. The good news is that data portability and security policies are being scrutinized closely by several organizations.

Intensely naïve

Mr. Binstock observed that no cloud storage provider will promise that it will not access your data under any circumstances. It is also common to find explicit clauses that allow law enforcement agencies access to your data.

Believing that this is acceptable because there is nothing incriminating in one’s data storage is, in his words, “intensely naïve.” The obvious problem, notes Mr. Binstock, is that any government agency examining your data is under no contractual obligation to you to keep it safe, or even to delete the copies that were created.

Neophobia

Chenxi Wang at Forrester noted that an effective assessment strategy must cover data protection, compliance, privacy, identity management, and other related legal issues. “In an age when the consequences and potential costs of mistakes are rising fast for companies that handle confidential and private customer data, IT security professionals must develop better ways of evaluating the security and privacy practices of the cloud services.”

Network. The idea of putting network health in the hands of the ISPs is very troubling. Have you ever tried to work with an ISP to find out why your round-trip latency times are so high? Can your organization confidently define the bandwidth requirements of your apps? The end-to-end throughput needs? Where will your data really be? Will it take the same path today and tomorrow? Who will pick up the phone when you call to say “the cloud is slow”? Will you be able to understand them?

Complexity. As cloud computing evolves, “combinations of cloud services will be too complex and untrustworthy for end consumers to handle their integration,” according to a report from Gartner Inc. Daryl Plummer, chief Gartner fellow, notes:

Unfortunately, using [cloud] services created by others and ensuring that they’ll work — not only separately, but also together — are complicated tasks, rife with data integration issues, integrity problems and the need for relationship management.

Finances. Cloud computing changes the way software is purchased. The model of buying software once and then opting to buy a newer version a few years later may be on the way out. With cloud computing, the vendor can simply raise prices the following month. It requires a different mindset: subscription fees as opposed to a one-time purchase. We will see how the public takes it.

These are some of the issues that must be addressed before companies can decide that cloud computing offers a better return than providing similar services in-house, without increasing risk.

rb-

Sure, “the cloud” will work for most people most of the time, but when there are a lot of users, there will be a lot of errors. With 100,000 users, 10% having problems over 10 years means 10,000 unhappy users.
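The back-of-the-envelope math is simple enough to check (the 10% problem rate is this post’s illustrative figure, not a measured one):

```python
users = 100_000
problem_rate = 0.10  # illustrative: fraction of users hitting problems over 10 years

unhappy = round(users * problem_rate)
print(f"{unhappy:,} unhappy users")

# the same rate at other scales
for n in (1_000, 1_000_000, 100_000_000):
    print(f"{n:>11,} users -> {round(n * problem_rate):,} unhappy")
```

The point is that even a modest per-user failure rate turns into a large absolute number of unhappy customers once a cloud service reaches scale.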



40 Years of Malware – Part 3

2011 marks the 40th anniversary of the computer virus. Help Net Security notes that over the last four decades, malware instances have grown from 1,300 in 1990 to 50,000 in 2000 to over 200 million in 2010. Fortinet (FTNT) marks this dubious milestone with an article that counts down some of the low-lights of malware’s evolution.

The Sunnyvale, CA, network security firm says that viruses evolved from academic proofs of concept to geek pranks, and then into cybercriminal tools. By 2005, the virus scene had been monetized, with almost all viruses developed for the sole purpose of making money via more or less complex business models. According to FortiGuard Labs, the most significant computer viruses of the last 40 years are:

See Part 1 Here – See Part 2 Here – See Part 3 Here – See Part 4 Here

2001 – E-mail and the Internet became primary transmission vectors for malware by 2001, as scripts automatically loaded viruses from infected websites. The Code Red worm targeted Web servers rather than users. By exploiting a vulnerability in Microsoft IIS servers, Code Red automatically spread to nearly 400,000 servers in less than one week. The worm replaced the homepage of compromised websites with a “Hacked By Chinese!” page. Code Red had a distinguishing feature designed to flood the White House website with traffic from the infected servers, probably making it the first documented case of large-scale ‘hacktivism’.

Shortly after the September 11 attacks, the Nimda worm (“admin” spelled backward) infected hundreds of thousands of computers worldwide. Nimda is one of the most complicated viruses, with many different methods of infecting computer systems and duplicating itself.

2003 – Widespread Internet attacks emerged as SQL Slammer (or Sapphire) infected the memory of servers worldwide, clogging networks and causing shutdowns. On January 25, 2003, Slammer first appeared as a single-packet, 376-byte worm that generated random IP addresses and sent itself to them. If an address belonged to a computer running an unpatched copy of Microsoft’s (MSFT) SQL Server Desktop Engine, that computer would immediately begin firing the worm off to more random IP addresses. Slammer was remarkably effective at spreading: it infected 75,000 computers in 10 minutes. The explosion of traffic overloaded routers across the globe, which created higher demands on other routers, which shut them down, and so on.
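Slammer’s speed follows directly from that random-scanning mechanic. A toy model in Python shows why (the scan rate and population figures are my illustrative assumptions, not Slammer’s actual parameters, and this is not worm code: it only iterates the expected-infection arithmetic one second at a time):

```python
# Toy model of a random-scanning worm in the spirit of Slammer.
vulnerable = 75_000              # hosts running the unpatched service (assumed)
address_space = 2 ** 32          # IPv4 addresses probed at random
probes_per_host_per_sec = 4_000  # assumed scan rate of one infected host

infected = 1.0
seconds = 0
while infected < 0.99 * vulnerable:
    susceptible = vulnerable - infected
    # expected probes this second that land on a still-susceptible host
    new_victims = infected * probes_per_host_per_sec * susceptible / address_space
    infected = min(vulnerable, infected + new_victims)
    seconds += 1

print(f"~99% of the vulnerable population infected after {seconds} s "
      f"(~{seconds / 60:.0f} minutes)")
```

Because every infected host immediately starts scanning, growth is exponential until the vulnerable population runs out, which is why a single UDP packet could sweep the Internet in minutes.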

The summer of 2003 saw the release of both the Blaster and Sobig worms. Blaster (aka Lovsan or MSBlast) was the first to hit. Detected on August 11, the worm spread rapidly, peaking in just two days. Transmitted via network and Internet traffic, it exploited a vulnerability in Windows 2000 and Windows XP and, when activated, presented the PC user with a menacing dialog box indicating that a system shutdown was imminent.

The Sobig worm hit right on the heels of Blaster. The most destructive variant was Sobig.F, which generated over 1 million copies of itself in its first 24 hours. The worm infected host computers via e-mail attachments such as application.pif and thank_you.pif. When activated, it transmitted itself to e-mail addresses discovered in a host of local file types. The result was massive amounts of Internet traffic. Microsoft announced a $250,000 bounty for anyone who identifies Sobig.F’s author, but to date, the perpetrator has not been caught.

2004 – The Sasser worm built on the autonomous nature of Code Red. It spread without anyone’s help by exploiting a vulnerability in the Windows XP and Windows 2000 Local Security Authority Subsystem Service (LSASS), described in Microsoft Security Bulletin MS04-011. This was the first widespread Windows malware of its kind, made even more annoying by a bug in the worm’s code that turned infected systems off every couple of minutes.

This was also the first time that systems whose function isn’t normally related to the Internet (and that mostly existed before the Internet) were severely affected. Sasser infected more than one million systems, and the damage is thought to exceed $18 billion.

Bagle, first detected in 2004, infected users through an email attachment and used email to spread itself. Unlike earlier mass-mailing viruses, Bagle did not rely on the MS Outlook contact list; instead, it harvested email addresses from various document files stored on the infected computer. Bagle opened a backdoor through which a hacker could gain access to and control of the infected computer, downloading more components to spy on the user, steal information, or launch DDoS attacks.

MyDoom is another mass-mailing worm, discovered in 2004. It spread primarily through email, but it also attacked computers by infecting programs stored in the shared folder of the peer-to-peer software KaZaA. MyDoom slowed global Internet access by ten percent and cut access to some websites by 50 percent. It is estimated that during the first few days, one out of every ten email messages sent contained the virus.

2005 – In 2005, Sony BMG introduced secret DRM software to report music copying; other rootkits appeared, providing hidden access to systems.

MyTob appeared in 2005 and was one of the first worms to combine a botnet with a mass-mailer. MyTob marks the emergence of cybercrime: criminals developed business models to “monetize” botnets that installed spyware, sent spam, hosted illegal content, and intercepted banking credentials. The revenue generated from these new botnets quickly reached billions of dollars per year.

rb-

By 2005, cybercriminals were starting to put all the parts together. Slammer proved that Microsoft systems could be used to spread attacks; Blaster and Sobig improved the infection rate; Bagle began to mine targets for data and install backdoors so attackers could continue to re-use victims’ systems. MyDoom started to use P2P networks, an early form of social network, for attacks. Sony proved that rootkits could be widely distributed, and MyTob was the first of the modern botnets, leading the world into today’s monetized cybercrime age, described in Part 4.



Super-Fi OK’d by IEEE

I usually don’t have a problem getting a wireless signal here in my Bach Seat. However, there are some areas where I coordinate technical services that don’t get wired or wireless Internet. In these rural areas, where AT&T (T), Verizon (VZ), Sprint Nextel (S), Comcast (CMCSA), and their fellow travelers fear to tread because they can’t make a buck, some help may be on the way from the IEEE.

In 2009, the Institute of Electrical and Electronics Engineers (IEEE) started development of IEEE standard 802.22, which addresses the need for broadband wireless access in rural areas where it is not economical to deploy a wired infrastructure. In July 2011, the IEEE announced that it had published the standard, titled “IEEE 802.22-2011 Standard for Wireless Regional Area Networks in TV Whitespaces” (PDF).

The IEEE press release states: “This new standard for Wireless Regional Area Networks (WRANs) takes advantage of the favorable transmission characteristics of the VHF and UHF TV bands to provide broadband wireless access over a large area up to 100 km (60 miles) from the transmitter. Each WRAN will deliver up to 22 Mbps per channel without interfering with reception of existing TV broadcast stations, using the so-called white spaces between the occupied TV channels.” That part of the spectrum, known as white space, sits between broadcast TV channels and became available when broadcast TV stations switched from analog to digital in 2009.
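Some back-of-the-envelope arithmetic on the press release’s figures (treating coverage as a simple circle, which ignores terrain and real-world propagation) shows why those numbers matter for rural access:

```python
import math

radius_km = 100  # range quoted in the IEEE press release
rate_mbps = 22   # per-channel rate quoted in the IEEE press release

coverage_km2 = math.pi * radius_km ** 2
print(f"One WRAN base station could cover ~{coverage_km2:,.0f} km^2 "
      f"at up to {rate_mbps} Mbps per TV channel")
```

That is an enormous footprint compared with a Wi-Fi cell, which is the whole point of reusing low-frequency TV spectrum in sparsely populated areas.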

The White Space Coalition, led by Microsoft (MSFT), Google (GOOG), Dell (DELL), and other tech titans, strongly supports the use of the white spaces in the U.S., going up against strong opposition led by Michigan’s own John Dingell and big media like the NFL, MLB, NASCAR, NBA, NHL, NCAA, PGA Tour, and ESPN, who say unlicensed devices in the TV bands would interfere with their signals. IEEE 802.22 reportedly will not interfere with TV broadcasts because it incorporates advanced cognitive radio capabilities.

rb-

I met Mr. Dingell about a dozen years ago at a school, trying to encourage the politician to support schools when the USF started the eRate program. I recall Mr. Dingell telling me he could not support eRate because he did not trust the FCC to get it right. At least he is consistent.

I believe there is a very good chance this technology will never be a commercial success. The wireless carriers will squash it like they have squashed municipal Wi-Fi and community fiber networks. The improved speeds and coverage areas are a threat to their limited 4G coverage, and they would lose their monthly pound of flesh from capped, rate-limited data plans.

It will be up to us in the public sector to implement this technology for our clients.

What do you think?

Will Super-Fi ever see the light of day?


