Virtualization.com

News and insights from the vibrant world of virtualization and cloud computing


research

Azure Uses Intel Virtualization Extensions To Counter Malware

July 22, 2008 by Robin Wauters


Paul Royal, principal researcher at Damballa, has developed a new tool called Azure, which takes advantage of the virtualization extensions in Intel's chips to evade the virtual machine and sandbox checks malware authors often build into their 'work'. Because the extensions sit at the hardware level, below the host OS, the malware has no way to detect Azure, allowing researchers to analyze its behavior unimpeded.
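To see what Azure is up against, consider the kind of in-guest check it is designed to defeat. Here is a minimal sketch of one well-known trick (an illustration, not Azure's or any real malware's code), assuming a GCC or Clang toolchain on x86: most hypervisors set the CPUID "hypervisor present" bit, which code running inside a guest can read, while a monitor sitting below the OS in hardware raises no such flag.

```c
/* Illustrative in-guest VM check (hypothetical example):
 * CPUID leaf 1 reports a "hypervisor present" flag in ECX bit 31,
 * which most hypervisors set. A hardware-level monitor such as
 * Azure runs outside the guest, so this check comes back clean. */
#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang wrapper for the CPUID instruction */

static int hypervisor_present(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;                /* CPUID leaf 1 not supported */
    return (ecx >> 31) & 1;     /* bit 31: hypervisor present */
}

int main(void)
{
    puts(hypervisor_present() ? "VM detected" : "no VM detected");
    return 0;
}
```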

“The whole point is to get out of the guest OS so the malware can’t detect you and attack,” said Royal. “Intel VT doesn’t have the weakness of in-guest approaches because it’s completely external. Others use system emulators, but to get everything exactly right in terms of emulation can be tricky.”

Royal plans to release the source code for Azure at the upcoming Black Hat conference in Las Vegas and to make the tool available for download as well. He is still working on features for a future version of Azure, including a precision automated unpacker and a system call tracer.

Intel's virtualization technology (VT) is a set of extensions added to some of the company's processors and chipsets that implement virtualization at the hardware level rather than in software. VT is designed to help enterprises make better use of their hardware resources and save energy.
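For readers who want to check whether a given machine has these extensions, here is a minimal sketch, again assuming a GCC or Clang toolchain on x86: CPUID leaf 1 reports the VMX feature flag in ECX bit 5, which VT-aware software tests before attempting hardware-assisted virtualization.

```c
/* Minimal VT-x (VMX) feature probe. Note that firmware can still
 * disable VMX even when the CPU advertises it, so this is only a
 * first-pass check. */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && ((ecx >> 5) & 1))
        puts("CPU advertises Intel VT-x (VMX)");
    else
        puts("no VT-x support reported");
    return 0;
}
```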

[Source: SearchSecurity]

Filed Under: News Tagged With: Azure, Black Hat, Black Hat conference, Damballa, Damballa Azure, hardware virtualization, intel, Intel Virtualization, Intel virtualization extensions, Intel virtualization technology, Intel VT, malware, Paul Royal, research, security, virtualisation, virtualization, virtualization extensions

IDC Research Shows Strong Server Virtualization Adoption in Europe

July 7, 2008 by Robin Wauters

According to ONStor, Europe is lagging in storage virtualization adoption. Recent research from IDC, however, shows that server virtualization is spreading rapidly among organizations already using the technology: 35% of servers purchased in 2007 were virtualized, and 52% of those bought in 2008 are expected to be. Of the organizations not yet using virtualization, 54% expect to adopt it within the next 18 months.

“Virtualization use has exploded since our last survey of the European market,” said Chris Ingle, consulting and research director at IDC's Systems Group. “Both large organizations and smaller businesses are using the technology for a wider range of applications and for business-critical projects. As use of virtualization grows, the challenges around managing complexity, finding skills and software licensing become more apparent.”

Further Findings Include:

  • Organizations are increasing their virtualization of x86 systems for core business applications, although the majority of virtualization is still for test and development and for network server applications. Expertise and skills are the biggest barrier to virtualization adoption.
  • Growth of virtualization as a strategy remains strong, rising from 46% of the base to 54%. What is interesting is that virtualization is growing as a datacenter strategy in itself rather than as part of other projects. This supports the view that virtualization is increasingly seen as a standard for a wide range of workloads.
  • VMware is the clear market leader in virtualization technology, with 82% of the sample using VMware. Despite high levels of Linux use, only 3% of the sample were using Xen as their virtualization platform. Microsoft was used by 13% of the sample, with various Unix technologies and mainframes accounting for 14%.
  • 59% of implementations have fewer than four VMs or partitions per physical box. The largest growth area for virtualization use over the past year, particularly in small and medium businesses, is improving disaster recovery, backup, and availability.
  • Availability of skills and application vendor licensing are the factors causing most problems for virtualization users. 23% of virtualization users report that their application vendors’ licensing is still not meeting their needs and 33% of large businesses report that it limits use of virtualization.
  • Despite seeing virtualization as a vital tool for their business, the majority of organizations do not measure its benefits, and they use virtual infrastructure in the same way they do physical infrastructure.

The IDC study was first carried out in 2007 and was repeated in Q1 2008 with a larger sample of organizations and a wider range of questions.

Filed Under: News Tagged With: adoption, Europe, IDC, research, server virtualization, virtualisation, virtualization, virtualization adoption

David Coyle, Gartner Researcher: The 7 Side Effects Of Lousy Virtualization

June 24, 2008 by Robin Wauters

David Coyle, research VP at Gartner, detailed the seven side effects at the research firm’s Infrastructure, Operations and Management Summit, which drew nearly 900 attendees. While virtualization promises to solve issues such as underutilization, high hardware costs and poor system availability, the benefits come only when the technology is applied with proper care and consistently monitored for change, Coyle explained.

Here are the reasons Gartner says virtualization is no IT cure-all:

1. Magnified failures. In the physical world, a server hardware failure typically means one server fails, and backup servers step in to prevent downtime. In the virtual world, depending on the number of virtual machines residing on a physical box, a hardware failure can impact multiple virtual servers and the applications they host.

2. Degraded performance. Companies looking to ensure top performance of critical applications often dedicate server, network and storage resources to those applications, segmenting them from other traffic so they get the resources they need. With virtualization, the goal is a dynamic environment that shares resources and allocates them automatically on demand. At any given time, an application's performance could degrade, perhaps not to the point of failure, but to something slower than desired.

3. Obsolete skills. IT might not realize that its in-house skill sets won't carry over to a large virtualized production environment until that environment is live. The skills needed to manage virtual environments should span all levels of support, including service desk operators who may be fielding calls about virtual PCs. Companies will feel a talent shortage as they move toward more virtualized systems, and Coyle recommends starting the training now.

4. Complex root cause analysis. Virtual machines move; that is part of their appeal. But as Coyle pointed out, it is also a potential issue when managing problems. In the past a server problem could be confined to one box; now the problem can move with the virtual machine and lull IT staff into a false sense of security.

5. No standardization. Tools and processes used to address the physical environment can’t be directly applied to the virtual world, so many IT shops will have to think about standardizing how they address issues in the virtual environment.

6. Virtual machine sprawl. The most documented side effect to date, virtual server sprawl results from the combination of easy deployment and a lack of life-cycle management for virtual machines. Consolidation efforts can go awry when more virtual machines crop up than there are server administrators to manage them; keeping an accurate inventory is the obvious first defense (see the sketch after this list).

7. May be habit forming. Once IT organizations start to use virtualization, they can't stop themselves, Coyle said. He offered tips to help curb the damage done by giving in to a virtualization addiction.
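On the sprawl point, the summit talk names no tooling, so the following is only a minimal sketch of that first defense, enumerating every VM on a host, written against libvirt (an assumed management API; any inventory mechanism would do):

```c
/* Hedged sketch: list all domains (running or not) on one host,
 * as a starting point for a VM inventory. Assumes libvirt and a
 * local QEMU/KVM hypervisor. Build: gcc list_vms.c -lvirt */
#include <stdio.h>
#include <stdlib.h>
#include <libvirt/libvirt.h>

int main(void)
{
    virConnectPtr conn = virConnectOpenReadOnly("qemu:///system");
    if (!conn) {
        fprintf(stderr, "failed to connect to hypervisor\n");
        return 1;
    }

    virDomainPtr *domains = NULL;
    int n = virConnectListAllDomains(conn, &domains, 0); /* 0 = no filter */
    for (int i = 0; i < n; i++) {
        printf("%s\n", virDomainGetName(domains[i]));
        virDomainFree(domains[i]);
    }
    free(domains);
    virConnectClose(conn);
    return 0;
}
```

A real inventory would also record owner, purpose, and creation date for each VM so that unused machines can be retired, which is precisely the life-cycle management Coyle says is missing.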

[In large part thanks to NetworkWorld]

Filed Under: Featured Tagged With: David Coyle, gartner, Gartner Infrastructure Operations and Management Summit, Infrastructure Operations and Management Summit, research, side effects, virtualisation, virtualization

EMC Joins Daoli Trusted Infrastructure Research Project

June 16, 2008 by Robin Wauters

EMC has joined the Daoli Trusted Infrastructure Project, which conducts research into “trust and assurance” in cloud computing environments. EMC joins a growing global research team that today includes four of China's leading technical universities: Fudan University, Huazhong University of Science and Technology (HUST), Tsinghua University, and Wuhan University. The team's research will focus on cloud computing, trusted computing and virtualization.


The research will explore a variety of techniques that could be applied to secure the underlying physical location as well as broadly shared resources.

“The team is exploring the convergence of several key technologies including cloud computing, trusted computing and virtualisation,” said Burt Kaliski, director of EMC’s Innovation Network. “It will look at how they might be applied to provide high-assurance software environments inside and outside the enterprise. The Daoli Project will help us understand what our customers are likely to encounter in the future, and we look forward to sharing the knowledge this research will generate.”

Participants will share findings with researchers worldwide by way of a wiki hosted by Tsinghua University in Beijing.

Details of the research are expected to be discussed at the third annual Asia-Pacific Trusted Infrastructure Technologies Conference in China in October.

[Source: VNUnet]

Filed Under: News, Partnerships Tagged With: cloud computing, Daoli, Daoli Project, Daoli Trusted Infrastructure Project, EMC, Fudan University, Huazhong University of Science and Technology, research, research project, trusted computing, Tsinghua University, virtualisation, virtualization, Wuhan University

Virtualization Picking Up More Steam in Asia Pacific

June 6, 2008 by Robin Wauters

Research from Springboard had already estimated that the virtualization software and services market in Asia Pacific would reach $1.35 billion by 2010, growing at a CAGR of 42%. Here is another indication that Asia is poised to become the biggest growth market for virtualization, particularly among financial institutions.

According to The Asian Banker, the Asia Pacific region is now adopting virtualization technology at a faster rate than the U.S.: Forrester says 14 percent of companies in Asia plan to implement virtualization in the next 12 months, compared with 12 percent of companies in America.

According to Jim Lenox, VMware’s South Asia general manager, the motivation for adopting virtualization has varied significantly across Asia’s banking sector, with consolidation and cost savings as the “first-step” goal.

“After the organization achieves some experience running virtual machines, we generally see them expanding their usage of virtualization largely due to the agility and flexibility that comes with running applications inside of easy-to-create virtual machines.”

Virtualization is often cited as one of the solutions for improving banks' disaster recovery capabilities, an issue that has come into the spotlight following the recent natural disasters in China and Myanmar.

Filed Under: News Tagged With: Asia, Asia Pacific, banking, research, Springboard, Springboard Research, virtualisation, virtualization, virtualization Asia, virtualization Asia Pacific

IBM Unveils Research Initiative PHANTOM, Aims To Protect Virtual Servers Better

April 10, 2008 by Robin Wauters

IBM recently announced a breakthrough in safeguarding virtual server environments and introduced new software to help businesses better manage risk. The company said the advances can provide businesses with substantial improvements in securing information, applications, and IT infrastructures around the globe.


IBM, the company that pioneered the concept of virtualization with its mainframe systems, is tackling the security issue with Project PHANTOM, an initiative so secret that IBM won't even say what the name means. This is part of the announcement:

IBM's PHANTOM initiative aims to create virtualization security technology to efficiently monitor and disrupt malicious communications between virtual machines without being compromised. In addition, full visibility of virtual hardware resources would allow PHANTOM to monitor the execution state of virtual machines, protecting them against both known and unknown threats before they occur. It is also designed to increase the security posture of the hypervisor, a critical point of vulnerability, because once an attacker gains control of the hypervisor, they gain control of all of the machines running on the virtualized platform. For the first time, the hypervisor — the gateway to the virtualized world and all that lies above it — can be locked down.
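IBM hasn't published how PHANTOM hooks the hypervisor, so any code can only be a stand-in. As a very loose illustration of one small piece, observing a guest's execution state from outside the guest, here is a minimal sketch using libvirt (an assumption; this is not PHANTOM's mechanism), where the domain name "guest1" is a placeholder:

```c
/* Loose illustration only: poll a guest's execution state from the
 * host side via libvirt. PHANTOM's real monitoring is unpublished.
 * Build: gcc watch_vm.c -lvirt */
#include <stdio.h>
#include <libvirt/libvirt.h>

int main(void)
{
    virConnectPtr conn = virConnectOpenReadOnly("qemu:///system");
    if (!conn) {
        fprintf(stderr, "failed to connect to hypervisor\n");
        return 1;
    }

    virDomainPtr dom = virDomainLookupByName(conn, "guest1"); /* placeholder */
    if (dom) {
        virDomainInfo info;
        if (virDomainGetInfo(dom, &info) == 0)
            printf("state=%d vcpus=%u cpuTime=%llu ns\n",
                   info.state, info.nrVirtCpu,
                   (unsigned long long)info.cpuTime);
        virDomainFree(dom);
    }
    virConnectClose(conn);
    return 0;
}
```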

Ars Technica had a call with the people at IBM. The company was still not willing to talk about the project in any detail, but the call did yield some important information, summarized below.

For starters, PHANTOM is not one particular technology, but rather a broad research initiative within IBM that will eventually result in a range of products, services, best-practices whitepapers, and more. The initiative was started two years ago as a collaboration among various hardware and software groups within IBM, and has since expanded to include some third parties whose identities IBM isn't revealing just yet. The internal groups involved include IBM's X-Force Threat Analysis Service (a division of IBM Internet Security Systems), the IBM Watson research center, and the server platform groups behind the z- and p-series servers, among others.

IBM stressed that the initiative will produce results for a wide variety of hardware/software combinations, including x86 systems, Windows, Linux, POWER, and others. So the scope of PHANTOM, broadly defined, includes all virtualization platforms, products, and services.

Clearly, whatever else it is, PHANTOM is extremely ambitious. It is also still mostly under wraps, so we'll have to wait for further announcements before more details emerge.

Filed Under: News Tagged With: IBM, IBM PHANTOM, PHANTOM, Project PHANTOM, research, virtual server, virtualisation, virtualization, virtualization security
