31 January 2011

Meraki Debuts Networking as a Service Pricing Program

Eliminating upfront hardware costs for users, Meraki has announced Networking as a Service, a new pay-as-you-go pricing model for its cloud-managed network infrastructure products.

Just as Amazon Web Services brings cloud economics to the datacenter and Salesforce.com brings them to CRM, Networking as a Service brings the cost savings and flexibility of the cloud to enterprise networking. It eliminates upfront hardware costs, improves cash flow, and allows customers to grow, shrink, or upgrade their networks at any time.

Networking as a Service offers several advantages over traditional capex models, shifting networking from a capital expense to an operating expense in the same way cloud computing has done for other areas of IT spending.

"Meraki's Networking as a Service program provides a cost-effective and convenient way for organizations to obtain a Meraki system on a 'pay as you go' model," said Hans Robertson, VP of product management and co-founder at Meraki. "Networking as a Service eliminates upfront capital expense, gives you the option to upgrade your hardware for no cost at any time, and removes the financial risk of owning too much infrastructure."

The new offering allows users to pay only for what they use, with no long-term commitment, and comes with maintenance, support, and upgrades included. With wireless LANs starting at $25 per access point per month and branch routers from $35 per router per month, Networking as a Service is available for all Meraki enterprise wired and wireless products. The subscription fee covers all hardware, software licenses, support, maintenance, and upgrades, the company stated.
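As a rough illustration of how the published per-unit prices add up, here is a small cost sketch. The per-unit figures ($25 per AP, $35 per router per month) come from the article; the deployment sizes are hypothetical examples, not Meraki figures.

```typescript
// Illustrative only: monthly Networking as a Service cost for a hypothetical
// deployment, using the published per-unit prices. The deployment sizes
// below are invented for illustration.
const AP_MONTHLY = 25;      // wireless LAN, per access point per month
const ROUTER_MONTHLY = 35;  // branch router, per router per month

function monthlyCost(accessPoints: number, routers: number): number {
  return accessPoints * AP_MONTHLY + routers * ROUTER_MONTHLY;
}

// A hypothetical 10-branch company with 4 APs and 1 router per branch:
const branches = 10;
const total = monthlyCost(branches * 4, branches * 1);
console.log(total); // 40 APs * $25 + 10 routers * $35 = $1,350 per month
```

The appeal of the model is visible even in this toy calculation: the monthly figure scales linearly with the number of devices, so shrinking a deployment shrinks the bill.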

In December 2008, the company started shipping a solar-powered Wi-Fi mesh device that, according to the manufacturer, will make wireless networks energy-independent. Meraki's device is likely to become popular in emerging markets, like India and China, where power outages often shut down wireless towers. In advanced countries like the U.S., the solar-powered device removes the need to hire an electrician to set up Wi-Fi on rooftops.

28 January 2011

2 Wi-Fi vendors adding cloud-based networking

Two wireless LAN vendors are adding wired connectivity to their Wi-Fi offerings to create managed, cloud-based networking services for a range of enterprise customers.

In effect, these new services from Meraki Networks and Aerohive Networks will let enterprises "rent" secure wired and wireless networking services, administered via an online management interface, without having to invest capital dollars in hardware and software.


The services from both vendors are designed to be simple to deploy, set up and use. For both companies, a prime target is the small and medium business market, and highly distributed companies with branch offices.

That's the same wireline market targeted by Cisco, with its highly successful Integrated Services Router (ISR) product family.


Meraki has unveiled a family of wireline network routers. Like Meraki's Wi-Fi access points, the routers will be managed and secured by a cloud-based set of services accessed via a Web interface. Enterprise customers will be able to buy and adjust a network infrastructure on a pay-as-you-go basis for an annual fee.

The Meraki routers combine four distinct functions traditionally handled by separate appliances: routing, application firewall, site-to-site VPN, and network monitoring. In addition, customers that have been relying on leased lines or similar WAN arrangements will be able to substitute redundant cable modem or DSL connections, for still more operational savings, according to Meraki executives.

The Meraki MX50 is for small branch offices, retail stores, and the like; it's priced at $400 per year, including maintenance. The Meraki MX70 is aimed at medium to large branch deployments, priced at $800 per year. Each will have a choice of two software "editions": the complete package includes all the advanced security features, including a full firewall; a second edition, lacking these features, is intended for use behind an existing enterprise firewall, and will cost less.

Both will be available starting Feb. 13.


Aerohive Networks is taking a different route -- acquisition -- to a similar goal. The Wi-Fi vendor announced this week that it has bought Pareto Networks, a privately held Sunnyvale, Calif., company. In June 2010, Pareto released a subscription networking service (based on several pending patents) aimed at mid-sized companies and branch offices. The service included an on-site router, with optional 3G or 4G interfaces for wireless backup, VPN, SSL, a proxy infrastructure, and a Web-based management application. There were no capital costs, just an all-inclusive monthly fee.

In fall 2009, Aerohive released its own Web-based wireless LAN offering, HiveManager Online, a cloud-based version of the WLAN management application to manage Wi-Fi networks in branch offices and small to medium businesses. Aerohive's WLAN architecture eliminates separate controllers by in effect distributing control functions to the access points and the cloud-based management software.

Random Observations from the Cloud, Opus 1

Cloud Computing Journal
Random observations from the World of Cloud: A friend of mine who serves as the chief architect of a small start-up currently in beta in the SF Bay Area tripped over an unexpected problem with video posting at the company's cloud services provider. The company transcodes member videos, then posts them. They've been tweaking the process to eliminate occasional wobbles in the final files. But re-posted vids had the same problems as the originals, which was confounding until they learned there was a 24-hour delay until vids go live, written into their SLA. They revised it to reduce the delay to one hour, but must still work around this...and consider what it means when membership ramps up.

Open the TwitDoor, HAL

I'm becoming more dependent on Twitter to refer people to my stories, to shout out to folks, and to post the occasional caustic barb when I've had too much coffee. But HAL (as I, and no doubt numerous others, refer to Twitter) has always been a little pissy and uncooperative. Now He's imposed the new Twitter on me--which I still don't like. Then again, I'm still not down with Mountain Dew coming in cans rather than the original green bottles, so maybe I'll adjust to #newtwitter. But one thing for sure, HAL stutter-steps his way through my new timeline in a way that makes it almost unusable.

Kicking Down the Door

I don't understand why HAL hasn't offered hosting and applications, and tried to lure me in to use Him as my one-stop online cloud-based environment. I might even pay a low monthly fee if I knew the service would be fast and ad-free forever. But this would mean that I have, in fact, ceded control of my cyberlife (which is almost the same thing as my real life) to computers that are far, far away. This aspect of Cloud is why commentators such as John C. Dvorak have expressed a loathing for it. And knowing with 100% certainty that a brown-shirted government will try to bully Twitter at any time into giving up whatever they have is absolutely chilling.
This represents the biggest threat to Cloud Computing today. Technical issues over wobbly vids or stuttering timelines can be resolved; unwarranted government intrusion cannot.


27 January 2011

Securing the Cloud an Impossible Feat? Think Again

A virtualized data center must be supported by a virtualized security system, which must in turn be validated by virtualized test systems and test methodologies.

The rapid rise of cloud computing has delivered cost and productivity benefits to thousands of organizations as over 200 cloud providers have emerged in the last decade. But questions of cloud security reveal that the growth of the networking and computing capabilities has outstripped the development of technologies to protect the cloud from cyber attacks.

Greg Day, security analyst at McAfee, told ComputerWeekly.com, "As cloud computing gains popularity, cyber-criminals are likely to target these services to steal information for financial gain."

At the heart of the issue is virtualization, the ability to run multiple server instances inside virtual machines (VMs) on a single physical server. This basic element is both the foundation of cloud computing and the source of new vulnerabilities that are already being exploited.

At an RSA security conference in San Francisco, John Chambers, Chairman and CEO of Cisco Systems, said that while cloud computing posed exciting opportunities, "It is a security nightmare and it can't be handled in traditional ways."

Traditional vs Virtual Security
When implemented and configured correctly, current cyber security solutions do a good job of detecting and blocking a wide range of malicious traffic from outside and even inside the data center. This is true because mature technology underlies security applications like intrusion detection systems (IDS), intrusion prevention systems (IPS) and deep-packet inspection (DPI).

Validation is the essential element in the technology cycle that drives maturity. Current security technology reached maturity through the iterative development of test methodologies that assessed and validated specific implementations. As we shall see, cloud-aware test methodologies are the key to bringing security to cloud computing.

Some may assume that existing security solutions are adequate to protect the cloud. After all, the virtual servers reside on physical servers that are behind the firewall. To see why this is not the case, we must look at the relationship between virtualization and security, more specifically, where security is traditionally implemented in a data center.

Security typically sits at the border of the LAN and WAN, protecting the data center infrastructure from threats. A firewall inspects all incoming and outgoing traffic, passes through legitimate traffic and blocks malicious traffic from the outside. In addition, a firewall can sit at the top-of-rack or end-of-row, monitoring traffic on the LAN to detect and contain inter-server threats from spreading through the LAN. These could be attacks that somehow got past the firewall or threats introduced internally, either unconsciously by uploading an infected file or intentionally through sabotage.

In the typical scenario, it is not feasible to deploy an IPS in front of every server. The best that can be done is to have an IPS per row or per rack and attempt to contain inter-server threats within a small segment of the data center. In addition, nothing sits inside a server detecting and stopping an intra-server threat, whether it is a hacked hypervisor or a rogue VM attacking and infecting other VMs in the same server.

For example, a compromised VM could send counterfeit transactions, destroying the integrity of back-end databases. Since all the traffic that leaves the physical server appears legitimate, traditional security systems can't detect and stop this breach.

Infra/Inter/Intra Vulnerabilities
Traditional data centers have inter-server and infrastructure vulnerabilities, such as the possibility of performance and security weaknesses internally between servers, externally at the gateway, and in the end-to-end network. Virtualization intensifies these potential threats and adds another level of vulnerability, intra-server, i.e., threats between VMs inside a single physical server.

Traditional end-to-end testing validates the performance of an entire system. System testing is even more important in the era of virtualization. With dozens of VMs per physical server, the amount of traffic one box can generate increases dramatically, easily filling a 10 Gigabit Ethernet link. The cloud can be composed of hundreds or thousands of physical servers.

Device testing evaluates the performance of a device interacting with other devices. For example, testing a security appliance involves sending legitimate traffic mixed with malicious traffic to the appliance and evaluating its ability to deflect threats while forwarding legitimate traffic at acceptable levels. The increase in utilization due to virtualization means an increase in traffic, placing more demands on the performance of the security appliance.

Now that we have multiple applications running in separate VMs on a single server, we have the possibility of security threats residing completely inside a physical server. Intra-server traffic never sees the network, so traditional methods of implementing and testing security are completely ineffective against intra-server threats. If a rogue application is spawned in a VM and launches a DoS attack on other VMs on the server, a security appliance in the DMZ will never know.

Virtual Security for Virtual Machines
Traditional security approaches are inadequate to protect the cloud because they can't detect and deflect intra-server threats. Virtual machines require virtual firewalls.

A virtual IPS performs the same functions as a physical IPS. The difference is where it is located. In the case of a virtual appliance, it resides in a service VM on the physical server along with the application VMs. A redirect policy allows a virtual controller to inspect and control VM-to-VM communications and direct the traffic to the appropriate appliance, whether physical or virtual. This arrangement places a virtual IPS in front of every connection to allow the traffic to and from every VM to be inspected.
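The redirect-policy idea can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation: the packet shape, the signature list, and the delivery chokepoint are all invented to show the principle that VM-to-VM traffic must pass through an inspection hook before it reaches the destination VM.

```typescript
// Toy sketch of a VM-to-VM redirect policy: all intra-server traffic is
// forced through an inspection chokepoint before delivery, so even traffic
// that never leaves the physical server gets inspected.
interface Packet {
  srcVm: string;
  dstVm: string;
  payload: string;
}

// Invented threat signatures, standing in for a real IPS rule set.
const THREAT_SIGNATURES = ["DROP TABLE", "\\x90\\x90\\x90"];

function inspect(pkt: Packet): boolean {
  // Returns true if the packet is clean, false if it matches a signature.
  return !THREAT_SIGNATURES.some((sig) => pkt.payload.includes(sig));
}

// The "redirect policy": the only path into a VM's inbox is through this
// function, which applies the virtual IPS check first.
function deliver(pkt: Packet, inbox: Map<string, Packet[]>): boolean {
  if (!inspect(pkt)) return false; // blocked by the virtual IPS
  const queue = inbox.get(pkt.dstVm) ?? [];
  queue.push(pkt);
  inbox.set(pkt.dstVm, queue);
  return true;
}
```

The design point is the chokepoint itself: because no VM can reach another except via `deliver`, an intra-server attack has no inspection-free path, which is exactly what a physical appliance at the rack or DMZ cannot guarantee.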

A cyber security system that combines physical IPS appliances with virtual IPS appliances has end-to-end visibility of the data center network, from the DMZ at the demarcation point to every VM in every server, and all devices of interest in between.

Metrics of Virtual Service: PASS
Here is where cloud-aware test methodologies come into play. Like the traditional data center, the virtualized data center has fundamental and critical network attributes - performance, availability, security, and scalability (PASS). Established test methodologies answer the critical questions related to the PASS attributes. However, virtualization fundamentally changes the environment that these methodologies address.

Traditional over-provisioning of fixed resources - physical servers, storage drives, network switches - no longer applies in the virtualized environment. At the service level, the cloud designer must take this into account by ensuring an adequate number of VM instances are provisioned to make dynamic access possible for all users. Cloud security must deliver the maximum number of new connections per second and maximum firewall throughput while blocking threats and malicious traffic.

The traditional methods of providing local redundancy must also be reconsidered in a virtualized environment. Servers that can support 1,000 or more VMs can become a single point of failure if appropriate approaches to VM load balancing, automated resource scheduling and live migration to other hardware are not built into the design. Cyber security in the cloud requires maintaining optimum application response time at maximum throughput.

Traditionally, cyber security is placed in strategic physical locations, such as at the WAN edge where requests and traffic from the Internet can be filtered and decrypted. However, geographic locations of physical servers have less meaning in a virtualized cloud, as users might be tapping resources from VMs located on one of any number of servers or even data centers. Virtual security must be cloud-aware. In the case of live migration, where a VM moves to another server with VMotion, the security solution must migrate the profile to allow legitimate traffic access to the new physical machine to avoid downtime for the end user.

The promise of infinite scale is appealing, but the elasticity of the physical infrastructure has finite limits. Addressing this risk requires a well-thought-out network infrastructure where aggregation and core interconnects do not become the bottlenecks of the elastic demand and scale that the cloud promises, maintaining the maximum number of secure concurrent connections at maximum throughput.

Virtual Test Systems for Virtual Security
For both traditional and virtual data centers, testing answers questions related to PASS. In particular, testing provides the answer to the question: How secure is any given cloud? Testing a cyber security solution addresses two vital questions at a high level:
Does the solution block all threats while allowing legitimate traffic to pass?
How does the solution affect throughput, performance and scalability?

Answering these questions is the goal, whether testing a legacy data center or a virtualized data center. Like the virtualization of a security application, the innovation of testing virtualization lies in extending the test endpoints.

As the world of computing has employed the VM to provide the many benefits of cloud computing, test systems have extended to the virtual level to validate the functionality of applications running in the VMs, and through the iterative development process, to facilitate improvements in performance, availability, security, and scalability, the critical metrics of data center efficiency.

A virtual tester is a software-based test system implemented in a virtual machine. To the network devices under test, and to the test engineer, it looks and behaves exactly as if it were a hardware tester. A virtual tester makes it possible to test cloud security at all the levels it has impact: intra-server, inter-server and infrastructure.

When assessing a cyber security system that employs virtual and physical appliances, testers reside at the endpoints to generate traffic and accumulate results.
Intra-server: Virtual testers for each VM in the physical server serve as endpoints.
Inter-server traffic: A virtual tester for each VM in the separate physical servers can serve as endpoints, or a virtual tester on one end and a physical tester on the other.
Infrastructure: Virtual testers for each VM in the test serve as endpoints and a physical tester at the gateway serves as the other.

The result is end-to-end testing of any IDS/IPS scenario, whether the endpoints span the whole of the data center or reside in a single physical server.

A recent test conducted by Broadband Testing demonstrated the use of cloud-aware PASS methodologies to validate a cloud-aware cyber security solution.

Cloud computing offers tangible benefits for increasing efficiency and reducing capital and operating costs for enterprises and other organizations, but security issues have the potential to negate those benefits. A virtualized data center must be supported by a virtualized security system, which must in turn be validated by virtualized test systems and test methodologies.

Improving Cloud Adoption Rates Through User Experience

As product manager at ScaleUp, one of my top jobs is to make sure our cloud management platform has as much impact as possible at what we call the cloud "point of purchase".

This is that magical spot where the consumer and provider meet. It's where consumers locate, order and manage the resources they need. It's the spot where providers manage their users, offer capacity, manage and monitor those resources, charge for them, enforce and apply automation, governance, security and other business rules and ultimately provide a service. In other words, there's a lot going on at the point of purchase.

It doesn't matter if the provider is an enterprise IT department in a private/hybrid cloud or if they are a tiny MSP offering public cloud or anything in between. It also doesn't matter if the user is a business IT user sitting in a cubicle farm or if they are a developer in a garage somewhere - the issues are the same often just with different labels.

Here are just a few of the things we consider every time we want to add something to our platform...


The typical consumer:
- Is not a cloud expert
- In general, does not know (or care) about how or why things work
- Does not have the desire or time to learn a complex system/process
- Wants a single, integrated platform for their IT resources and activities

The typical provider:
- Needs to support a wide range of use cases and user types
- Has great technology inside the datacenter, which is their primary focus
- Wants to offer complex technology in a simple, self-service manner

Since we launched our cloud management platform two months ago, I have spent a good amount of time showing people how we can simplify the way they provide and consume cloud services. The response has been fantastic, and the elegant user experience we have created on both sides of the "point of purchase" is accelerating stalled cloud projects and creating new ones for both providers and consumers. By easing user anxiety about consuming cloud resources and how they will be managed, enterprises and MSPs are moving forward with cloud projects at an accelerated pace.

The moral of the story is that while everyone is so focused on what's happening inside the datacenter, perhaps the most important missing link to improving cloud adoption rates in 2011 is what lies outside of the datacenter - the user experience.

26 January 2011

Cloud App Integration: What's the Best Path?

There's a lot of noise from vendors of every stripe about the cloud. Unfortunately, in the vendors' efforts to show how all their products are cloud-based, there's a lot of blurring about the specifics of what it means to be a cloud application. Consequently, this article will apply differently to every cloud vendor. (And for the purposes of this article, let's keep the discussion to SaaS and cloud-based apps from a vendor or integrator, not ones you build yourself, although some of the same principles apply.)


With that disclaimer behind us, one of the distinguishing characteristics of cloud software is the variety of ways it can be integrated. As most cloud applications present themselves as a series of Web services, they lend themselves to a service-oriented architecture, even if they don't follow all the SOA protocols. With the right toolkits and development attitude, you can integrate cloud applications with a variety of techniques...and use as many of them concurrently as you like, even in the same application. Of course, you have to understand the limitations of each approach—but there's nothing wrong with getting things done quickly. Let's look at this as layers of an onion.
Layer 1: On-Screen Integration

Otherwise known as mashups, this style of integration is the ultimate in quick and dirty. The coding exercise is the construction of iFrames for the screen layout and URLs with lots of parameters for grabbing the goodies from the other cloud. This is the baseline method for pulling images, maps, news items, and data feeds from publicly available services like Google (GOOG) or Yahoo (YHOO). This method will become increasingly powerful (particularly for demos) as graphing packages and other document services become commonplace as cloud services. AJAX can give the pages a modern, intuitive, and responsive UI. Unfortunately, mashups don't inherently offer much in the way of security, so you'll have to look at careful coding practices and server-side validation for sensitive data, and you'll probably want single sign-on or other authorization infrastructure to control access without irritating users. So the tradeoff at this layer is: simple code and read-only, or secured with complex code.
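The mechanics are simple enough to sketch. Assuming a hypothetical third-party map service (the endpoint and parameter names below are invented for illustration), the "coding exercise" is little more than building a parameterized URL and dropping it into an iframe:

```typescript
// Mashup-style integration sketch: build a parameterized URL for a
// hypothetical external service and embed it in an iframe. The service
// endpoint and parameter names are invented, not a real API.
function buildServiceUrl(base: string, params: Record<string, string>): string {
  const query = Object.entries(params)
    .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
    .join("&");
  return `${base}?${query}`;
}

function iframeMarkup(src: string, width: number, height: number): string {
  return `<iframe src="${src}" width="${width}" height="${height}"></iframe>`;
}

const url = buildServiceUrl("https://maps.example.com/embed", {
  q: "1 Market St, San Francisco",
  zoom: "14",
});
console.log(iframeMarkup(url, 400, 300));
```

Note how little code is involved, and also how little control: everything inside the iframe belongs to the other cloud, which is why the security caveats above apply.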
Layer 2: Presentation Layer Integration

Depending on the way your cloud application generates Web pages, you may have a programming layer on the server side which provides fertile ground for cloud integration. (In contrast, the mashup strategy works almost entirely in the browser.) While the mashup strategy is great for stitching together entire segments of a page (e.g., adding a map or graphic to a layout), integrating at the presentation layer shines in its ability to add individual fields within a section of a page. For example, it would be nice to add an indication of "how many days overdue is a customer payment" to the summary area of the CRM account page, but this field might only be available in your accounting system. Pulling this in at the presentation layer gives the users what they need to see, and is faster than doing a full-blown integration.

Of course, the strength of this approach is also its weakness: that payment overdue indicator would not be stored anywhere in the CRM system, so it wouldn't be available to support reports, alerts, or other functions. This approach is usually used for read-only data, as the presentation layer may not have the kind of security infrastructure available in the rest of the system. It all depends on the language you're using and the Web service security libraries available — but it usually doesn't make sense to attempt complex security mechanisms when integrating at the presentation layer.
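The "days overdue" example can be sketched as follows. The invoice shape and field names are hypothetical; the point is that the presentation layer computes the indicator at render time from the other system's data rather than storing it in the CRM:

```typescript
// Presentation-layer integration sketch: a server-side page template pulls
// "days overdue" from accounting data at render time. The invoice shape
// is invented for illustration; nothing is written back to the CRM.
interface Invoice {
  customerId: string;
  dueDate: Date;
  paid: boolean;
}

function daysOverdue(invoice: Invoice, today: Date): number {
  if (invoice.paid) return 0;
  const msPerDay = 24 * 60 * 60 * 1000;
  const diff = Math.floor((today.getTime() - invoice.dueDate.getTime()) / msPerDay);
  return Math.max(0, diff);
}

// Rendered into the CRM account page as display-only text. Since the value
// is never stored in the CRM, it cannot drive reports or alerts there.
function overdueBadge(invoice: Invoice, today: Date): string {
  const days = daysOverdue(invoice, today);
  return days > 0 ? `${days} days overdue` : "current";
}
```

Because the badge is a read-only string computed per page view, no security-sensitive write path is opened up, which is consistent with keeping complex security mechanisms out of the presentation layer.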

Cloud Breaches Show Need for Stronger Authentication

As organizations increase their reliance on cloud-based services and collaboration tools, and open their networks to more users, the number of security breaches is on the rise. A new study by Forrester Research shows that more than half of the 306 companies surveyed (54 percent) reported a data breach in the previous year.


Even with the growing security threats, most enterprises continue to rely on the traditional username and password sign-on to verify a user's identity, rather than strong authentication, according to the study.


The report, "Enhancing Authentication to Secure the Open Enterprise," was conducted by Forrester late in 2010 on behalf of Symantec Corp. (SYMC). The vendor wanted to evaluate how enterprises are evolving their authentication and security practices in response to changing business and IT needs as exemplified by cloud and software-as-a-service (SaaS) adoption, the business use of Web 2.0 services, and user mobility trends.

Password issues are the top access problem in the enterprise, according to the study. Policies on password composition, expiration, and lockout that are put in place to mitigate risk have become a major burden to users, impeding their ability to be productive. They also result in help desk costs due to forgotten passwords.

The Forrester study recommends that organizations implement strong authentication throughout the enterprise, not just for select applications.

Mauricio Angee, VP and information security manager at Mercantil Commercebank N.A., agrees that passwords have become a problem.

"Today, there is a high percentage of calls and service requests related to password resets in our environment," Angee says. "Two-factor authentication has been implemented for network sign-ons, in addition to the deployment of single-sign-on, which has helped us [reduce] the amount of password management."

The concern with passwords, Angee says, "is that we have given the user the responsibility to change passwords, remember long complex pass-phrases, secure PINs, carry tokens, etc. This is a practice that has proved to be a huge weakness to keep our environments secure, not to mention the huge challenge to information security professionals who have to enforce policies and maintain an expected level of security."

Moving the entire infrastructure to strong authentication requires time and resources dedicated to assessment, analysis and testing systems and applications in order to determine if these systems have the capability to be integrated, Angee says. "Often, constraints are found, mostly with legacy systems, which has been the major [reason] to avoid moving forward with strong authentication. This is definitely an initiative we will be focusing our efforts to determine the feasibility, impact, and the ROI."

21 January 2011

Interested in Location Services? Join us Friday for a live demo.

Meraki » The Official Meraki Blog

Many of you have expressed interest in our new Location Services, which allow enterprise customers to determine the location of WiFi clients without additional hardware. We're therefore holding a quick, informal webinar on Friday, during which we'll talk a bit about how this feature works under the covers, do a live demo, and hold a Q&A. The webinar runs just 15 minutes, so it's a great quick way to learn about this new feature. Registration is free.

We hope you will join us!


Gartner Positions OpSource as a 'Challenger'

Jan 13, 2011 (Close-Up Media via COMTEX) -- OpSource, Inc., a provider of enterprise cloud and managed hosting solutions, announced it has been positioned by Gartner, Inc. in the Challengers quadrant of the 2010 "Magic Quadrant for Cloud Infrastructure as a Service and Web Hosting" report.

According to Gartner analysts and report authors Lydia Leong and Ted Chamberlin, "For the past five years, the Web hosting market has been evolving toward on-demand infrastructure provisioned on a flexible, pay-as-you-go basis; the majority of hosting customers now obtain at least some of their infrastructure on demand, and most new hosting contracts include on-demand services. The market for traditional Web hosting services, especially for Internet and intranet Web content and applications, continues to grow."

"We believe our placement in the Challengers quadrant is a confirmation of our rapid growth over the past year as IT organizations have consistently selected OpSource's enterprise-ready hosting and cloud IaaS solutions for cost-effectively scaling mission-critical applications in the cloud," said Treb Ryan, CEO of OpSource. "By leveraging our deep expertise in managing complex hosted environments, we are able to customize our offerings to meet the needs of businesses of any size. With the recent launch of OpSource's Managed Services for the Cloud offering, we continue to break down the barriers to adoption for enterprise companies and smaller businesses migrating applications to the cloud."

According to a release, OpSource is a public cloud provider offering both application and system management services in the cloud. With the company's continued product innovation and positioning in Gartner's Magic Quadrant, OpSource demonstrates its growth in the cloud Infrastructure-as-a-Service market.

20 January 2011


Expand Networks (www.expand.com), the leader in WAN optimization for branch office consolidation and virtualization, today announced that Porsche Informatik, a division of Porsche Holding Austria, has implemented its advanced WAN optimization technology. The solution is accelerating and prioritizing critical business applications between its headquarters in Salzburg and its subsidiaries, corporate customers and dealerships across 16 European countries. Successfully enhancing user experience and increasing business productivity, the $550,000 project is set to pay for itself in just 18 months and could deliver up to $700,000 in cost savings over three years.

Porsche Informatik relies on its WAN for running critical business processes such as vehicle ordering, order tracking, delivery and customer financing. Robert Singer, project leader for the Expand rollout at Porsche Informatik, comments:

"Fast and reliable communication between our importers, dealerships and headquarters is critical for a streamlined supply chain. However, we found our existing bandwidth provision struggling to cope with the demand for applications and data travelling over the WAN. Lotus Notes and SAP, critical for ensuring accurate and up to date financial data, were particularly problematic; regularly saturating the network at peak times. With limited bandwidth available in many of our locations, we set about evaluating optimization solutions in order to reduce congestion over our WAN links and get more out of the network."

Having installed Juniper's WAN optimization solution (previously Peribit) in 2004, Porsche Informatik found that the ever-increasing demands on the network outstripped the benefits the solution provided, resulting in continuously degrading network performance across its distributed environments. The company chose to replace the legacy Juniper solution with Expand Networks following a thorough evaluation of the WAN optimization market, which included Riverbed, Blue Coat and Cisco WAAS.

Singer continues: "What Expand offered us was a way to increase traffic on our WAN links in the most cost effective manner. We quickly decided that Expand's superior byte-level caching techniques and ability to deliver a rapid ROI proved the best solution from a technological and cost perspective."

Porsche has completed the implementation of Expand Accelerators across 120 sites, with 40 more to follow before March 2011. The company also implemented ExpandView, Expand's centralized management platform, at its Headquarters. According to Singer, this sped up the roll-out significantly, "By using ExpandView, we were able to pre-configure all of the Accelerators centrally. It was practically plug and play, which saved us both time and money and enabled us to experience the benefits almost immediately."

Now in place, the compression capabilities of Expand have delivered significant increases in bandwidth, helping to manage peaks in demand and protect business critical applications from congestion, whilst its byte level caching techniques have reduced repetitive traffic flowing through the network. Singer comments:

"Users have responded positively to the Expand solution, stating that key applications were running at greater speed and their corresponding output had increased dramatically. The acceleration techniques of Expand have helped overcome the latency we were previously experiencing - Lotus Notes has seen a 400% (best value) increase in performance, and SAP over 800% (best value). As we extend our operations across Eastern Europe and other regions, the solution's ability to grow with us, and its ease of use and configuration, will be a huge benefit moving forward."

Christian Honore, Vice President of Sales EMEA at Expand Networks, concludes: "The Accelerators are designed to increase the efficiency of an enterprise's network without increasing overall IT costs. This enables large organizations like Porsche Informatik to quickly and easily extend the capacity of their networks, improving the user's network experience and productivity, and ultimately improving the overall response to customers."

Expand Networks Heralds Breakthrough Year for its Virtual Accelerator Sales and Cloud/Managed Service Provider Solutions


Roseland, NJ – 5th January 2011: Expand Networks (www.expand.com), the leader in WAN optimization for branch office consolidation and virtualization, today announced its continued growth and market momentum during 2010, having achieved another breakthrough year in which it cemented its position as an innovator and market leader, grew its datacenter business, increased its enterprise scale, and established new telco and service provider revenue streams.

Key annual highlights:

Expand now owns one of the largest and most mature installed bases in the market
Boasts a total deployed base of 60,000 units
Global customer base hits 5,000+ customers
Increased demand for Expand's Virtual Accelerator (VACC) within the Datacenter

Reflects growth in datacenter virtualization strategies
VACC offers easy install and management with VMware VSphere support
Rise in high-end enterprise deployments
Shift from tactical to strategic deployments, recurring orders from existing customers
Multiple 100-plus-site customer purchases and multi-million-dollar deals
Unique cloud offering and commercial model for telcos and managed service providers

Service providers embedding Expand technology as part of their service offerings
Pay-as-you-go subscription model, WAN optimization-as-a-Service
900 site managed services customer win with Spar Retailer

"Despite continued economic uncertainty, 2010 proved to be another stellar year for Expand where we managed to bolster and grow our focus markets as well as breaking out into new sectors such as telcos and service providers," explained Elie Barr, CEO of Expand Networks.

"Our innovation around Cloud based offerings and radical new commercial model and partner program for telcos and services providers fundamentally changed how they procure WAN optimization solutions on a de-risked, 'pay as you sell' basis, and is already generating tremendous results, pushing Expand right to the forefront of this growing market opportunity."

Expand also continues to achieve extraordinary levels of growth within its focus enterprise marketplace by enabling organizations to successfully execute on key IT initiatives such as branch office server consolidation, server based computing, optimized satellite communications, virtualization and virtual desktop infrastructure (VDI).

"Here, we've witnessed a significant shift in the size of organizations turning to WAN optimization to enable their strategic initiatives," continued Barr. "The tangible benefits of our technology gained significant mindshare in the high-end enterprise in 2010, firmly moving it from a branch office 'quick-fix' to a strategic company-wide business imperative. We completed some of our largest installations to date last year and added a raft of blue chip enterprises and service providers to our customer base. 2010 has been a year where we've turned technical innovation into market penetration."

Other 2010 highlights include:

Customer acquisition

Porsche Informatik - Accelerate and prioritize business critical applications across 16 European countries.

US Military - Improve performance of critical applications and communications for mobile warfighter units and remote executive travel teams.

SPAR Group - With Telkom SA, deployed WAN optimization as a managed service to accelerate applications across 900 national sites in just 6 months.

Christian Aid - Enhance connectivity and communications across its global network to help provide urgent, practical and effective assistance to some of the world's poorest countries.

DAPA (Defense Acquisition Programme Administration - South Korea) - Deployed over 350 Accelerators across its armed forces to enable fast and reliable connectivity for all army, navy, air force and special forces whether on land, at sea or in the air.

Strategic Partnerships and Alliances

HP AllianceONE - Gained network specialization partner status enabling customers to accelerate application deployment and optimize infrastructure capacity.

Inmarsat Connect Partner – Validation by the world's leading provider of global mobile satellite communications services and access to its global network of distribution partners.


AOS 6.3 – Enhanced feature set including Layer 7 QoS, dynamic SCPS support for satellite links and Microsoft RDP proxy.

ExpandView Virtual – Fully featured centralized management system that includes global deployment manager, centralized management and WAN application monitoring system within physical or virtual environments.

Accelerators 3830 and 3930 – Industry's most compact form factor outscaling competitive offerings to set new standards for price/performance at the branch office.

Global Expansion

Expand made a number of senior executive appointments in 2010 to oversee the Company's growth and investment in DACH, Southern Europe, Benelux, Latin America and APAC. Key appointments include:

Christian Honore – Vice President of Sales, EMEA
Markus Richter – Regional Manager, DACH
Christian Storey – Regional Director, LATAM

Expand Networks is the pioneer and leader in the WAN Optimization market and is positioned by Gartner Inc. in the 'Leaders' quadrant in its Magic Quadrant for WAN Optimization Controllers, 2009[i]. Expand helps organizations simplify their IT infrastructure while delivering remote offices fast, reliable and secure access to networked applications. This results in improved user productivity and cost-effective IT management. Expand offers a multi-service integrated platform that ensures superior performance for any application over any network. From its headquarters in Roseland, NJ and its global locations, Expand Networks serves more than 3,500 enterprise and public sector customers in over 100 countries including: American Express, Bacardi USA, BMW, Continental Airlines, Carr America, Colgate, Elizabeth Arden, Reed Exhibitions, Target and United States Department of Defense with over 40,000 units deployed and over 2,000 MS Terminal Services and Citrix customers. Expand is the largest supplier of WAN Optimization products to US Government and Military agencies (greater than 10,000 units) and also has the most units deployed at a single corporate location, (4,500).

Expand Networks, Accelerator, Expand Compass, ExpandView are trademarks of Expand Networks. All other trademarks are the property of their respective owners.

19 January 2011

Cloud Computing - What is its Potential Value for Your Company?

Cloud Computing - What is its Potential Value for Your Company?

Examining whether cloud computing makes good business sense for your company. In essence, cloud computing means running software and accessing data that reside somewhere else. ZDNet explains (Hinchcliffe, 2008) cloud computing in business-trend terms: "Software platforms are moving from their traditional centricity around individually owned and managed computing resources and up into the 'cloud' of the Internet."

eBook-Moving to the Cloud

There's a smarter, secure, collaborative way to work. Cloud-based messaging and collaboration apps help businesses increase productivity while simplifying IT and reducing costs.

When it comes to cloud computing, some companies have concerns around security and a perceived loss of control. But others are grabbing the opportunity to move to a more collaborative, Web-based platform. In this E-Book, we take a look at several companies that have migrated to Google Apps. Here, their IT leaders present the results they have achieved so far, as well as they lessons they have learned.

Cloud Computing- Latest Buzzword or a Glimpse of the Future?

Cloud computing generates as much excitement today as it does controversy. Go beyond all the hype and buzzwords and decide for yourself whether cloud computing is truly a game changer or just another technology fad. Learn about its benefits—and potential pitfalls—and why so many of today's leading companies already embrace this emerging technology.

Meraki Releases Cloud-managed Router, Introduces NaaS Program for Cloud-managed Networks

San Francisco-based cloud networking company Meraki has unveiled the Meraki MX series of cloud-managed routers, the company said in a press release. The new models, the MX50 and MX70, feature cloud-based centralized management, a firewall, internet gateway service, and application traffic shaping. The new routers are a step toward an easy-to-use cloud-based environment, providing an intuitive browser-based user interface that does not require trained specialists to operate the system. The MX series is designed to offer automatic security, signature and feature upgrades as well as network-wide monitoring.


The key features of the MX series include cloud-based centralized management, layer 7 application firewall and traffic shaper, site to site VPN, routing, DHCP, and a firewall. Additionally, the new hardware devices are able to automatically detect and monitor printers while tracking and showing printer ink levels across remote divisions. At present, Meraki's cloud management system is implemented in over 17,000 networks worldwide and the new devices are based on the same system. Therefore, MX series routers can be installed in single standalone networks and in large distributed networks alike.

The MX series comes in two editions: Enterprise Edition and Advanced Security Edition. The Enterprise Edition provides cloud-based centralized management, routing, and application traffic shaping. The Advanced Security Edition adds functionality such as site-to-site VPN and next-generation firewall capabilities.

Meanwhile, the company announced the introduction of Networking as a Service, a new pricing model for cloud-managed network infrastructure products. The company offers the NaaS pricing model for all its wired and wireless products.

"Meraki's Networking as a Service program provides a cost-effective and convenient way for organizations to obtain a Meraki system on a 'pay as you go' model. Networking as a Service eliminates upfront capital expense, gives you the option to upgrade your hardware for no cost at any time, and removes the financial risk of owning too much infrastructure," Hans Robertson, VP of Product Management and Co-Founder at Meraki, said in a company press release.

The MX series routers are also available via Meraki's new NaaS program, and customers can purchase devices through an annual subscription. The company states that the all-inclusive price covers the hardware, software licenses, ongoing upgrades, maintenance, and support provided by Meraki.

The news will likely boost the company's financial performance, although Meraki already boasts backing from Google and Sequoia Capital. In January 2008, the company received USD 20 million in Series B funding. Other investors in Meraki include DAG Ventures and Northgate Capital.

Meraki started as an MIT research project in 2006 and relocated to California once the project turned commercial. Its core business is providing cloud-managed wireless networks, and it has 17,000 networks deployed worldwide, according to the company. Meraki offers various products, including its flagship Enterprise Cloud Controller, which provides wireless access to the corporate LAN for offices, industrial firms, retailers, educational institutions, and multi-site locations.

18 January 2011

Next Generation Cloud Services Company Chooses Varidion QoS Solution


In the competitive world of offering cloud computing services, one way to differentiate is to offer comprehensive Quality of Service guarantees. This has led Varidion, the cloud communications company, to select Highlight from NetEvidence to give its customers full visibility into the quality and performance of cloud-based applications.

Varidion is a next generation Service Provider that delivers Business Applications, Managed Security and Cloud Data Centers to medium and large enterprises within a single guaranteed service. Its clients such as Maybourne Hotel Group (Claridge's and the Berkeley) and Reed Recruitment can now benefit from Highlight's full range of network and application monitoring capabilities.

Neil Camden, Technical Director at Varidion says, "Our customers are asking us to report on the quality and performance of cloud-based applications that are outside their network control. With Highlight from NetEvidence, we can now deliver this level of service. For example, when our clients use Salesforce.com, Google Apps or Microsoft Cloud, they have no way to report on the usability of these applications. But with Highlight, we can give them meaningful application performance data that allows them to identify issues or failures even though the applications are cloud-based and they don't own or control them."

Neil adds, "As a cloud-based service itself, Highlight is the perfect fit with our own cloud ecosystem. It fits the bill for service reporting, monitoring and alerting that is both comprehensive but easy to understand. NetEvidence is also a highly flexible organisation and its team is working to integrate Highlight with our own CRM solution to deliver an end to end solution. We particularly admired NetEvidence's open approach to service management."

Varidion's services are based on flexibility, enabling customers to change and adapt infrastructures to ever-shifting business dynamics… without penalty. The company's culture and philosophy are in stark contrast to large incumbent network operators or old-style network integrators that are reliant on fast-ageing technologies and/or rigid business models.

Richard Thomas, Managing Director of NetEvidence comments, "The team at Varidion is extremely knowledgeable whilst having a fresh approach to the market, which is based on simplicity, openness and trust. At NetEvidence, we have found that many enterprises just don't trust their application Service Providers since they have no visibility of the services they are getting. Varidion is definitely breaking the mould."

Salesforce.Com Buys Dimdim

IDG News Service — Salesforce.com announced Thursday it has purchased Web conferencing software vendor Dimdim for US$31 million, and plans to use the company's technology to augment its Chatter enterprise collaboration platform.

Dimdim provides a suite of messaging, screen-sharing and other capabilities. Its addition to Chatter will mimic "the proven Facebook model of combining collaboration and communication into an integrated service," Salesforce.com said.

Some 60,000 customers already use Chatter, which Salesforce.com first introduced in 2009. The Dimdim deal should spur even healthier adoption, Salesforce.com said in a statement.

Monthly Dimdim accounts will be available until March 15, while annual accounts will be valid until the ending subscription date, according to a FAQ document on Dimdim's website. Any recordings or uploaded documents will not be available after subscriptions expire and customers are encouraged to download them ahead of time, it adds.

Open-source code made available by Dimdim will still be accessible at SourceForge.net, but the company will no longer be contributing to that project, according to the FAQ.

The deal, which had been rumored for some time, also represents the latest effort by Salesforce.com, still best known for its core CRM (customer relationship management) software, to round out a portfolio of business applications and services.

At its recent Dreamforce conference, Salesforce.com unveiled Database.com, which gives customers access to its platform's underlying database infrastructure.

Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris's e-mail address is Chris_Kanaracus@idg.com

17 January 2011

5 Most Surprising Things about the Cloud in 2010

CIO — 2010 was the year "cloud computing" became colloquialized to just "cloud," and everyone realized "cloud," "SaaS" and all the other XaaS's (PaaS, IaaS, DaaS) were all different implementations of the same idea — a set of computing services available online that can expand or contract according to need.

Not all the confusion has been cleared up, of course. But seeing specific services offered by Amazon, Microsoft (MSFT), Oracle (ORCL), Citrix, VMware (VMW) and a host of other companies gave many people in IT a more concrete idea of what "the cloud" actually is.

What were the five things even experienced IT managers learned about cloud computing during 2010 that weren't completely clear before? Here's my list.
1. "External" and "Internal" Clouds Aren't All That Different

At the beginning of 2010 the most common cloud question was whether clouds should be built inside the firewall or hired from outside.

Since the same corporate data and applications are involved — whether they live on servers inside the firewall, live in the cloud or burst out of the firewall into the cloud during periods of peak demand — the company owning the data faces the same risk.

So many more companies are building "hybrid" clouds than solely internal or external, according to Gartner virtualization guru Chris Wolf, that "hybrid" is becoming more the norm than either of the other two.

"With internal clouds you get a certain amount of benefit from resource sharing and efficiency, but you don't get the elasticity that's the real selling point for cloud," Wolf told CIO.com earlier this year.
2. What Are Clouds Made of? Other Clouds.

During 2010, many cloud computing companies downplayed the role of virtualization in cloud computing as a way of minimizing the impact of VMware's pitch for end-to-end cloud-computing vision -- in which enterprises build virtual-server infrastructures to support cloud-based resource-sharing and management inside the firewall, then expand outside.

Pure-play cloud providers, by contrast, offer applications, storage, compute power or other at-will increases in capacity through an Internet connection without requiring a virtual-server infrastructure inside the enterprise.

Both, by definition, are virtualized, analysts agree, not only because they satisfy a computer-scientific definition, but because they are almost always built on data-centers, hosted infrastructures, virtual-server-farms or even complete cloud services provided by other companies.
3. "Clouds" Don't Free IT from Nuts and Bolts

Cloud computing is supposed to abstract sophisticated IT services so far from the hardware and software running them that end users may not know who owns or maintains the servers on which their applications run.

Cloud Computing Used to Hack Wireless Passwords

PC World — German security researcher Thomas Roth has found an innovative use for cloud computing: cracking wireless networks that rely on pre-shared key passphrases, such as those found in homes and smaller businesses.

Roth has created a program that runs on Amazon's Elastic Compute Cloud (EC2) system. It uses the massive computing power of EC2 to run through 400,000 possible passwords per second, a staggering amount, hitherto unheard of outside supercomputing circles--and very likely made possible because EC2 now allows graphics processing units (GPUs) to be used for computational tasks. Among other things, these are particularly suited to password cracking tasks.

In other words, this isn't a clever or elegant hack, and it doesn't rely on a flaw in wireless networking technology. Roth's software merely generates millions of passphrases, encrypts them, and sees if they allow access to the network.
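In outline, the process the article describes looks like the sketch below. It is deliberately simplified: it compares each candidate's derived key against a known target key, whereas a real cracking tool checks candidates against the MIC captured from a WPA 4-way handshake. The SSID and passphrases are made up for illustration.

```python
import hashlib

def derive_pmk(passphrase: str, ssid: str) -> bytes:
    """WPA-PSK pairwise master key: PBKDF2-HMAC-SHA1 with the SSID as
    salt, 4096 iterations, 32-byte output. This per-guess cost is the
    work that GPUs parallelize so effectively."""
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(),
                               ssid.encode(), 4096, dklen=32)

def brute_force(candidates, ssid, target_pmk):
    """Return the candidate passphrase whose derived key matches the
    target, or None if the wordlist is exhausted."""
    for guess in candidates:
        if derive_pmk(guess, ssid) == target_pmk:
            return guess
    return None

# Toy demonstration with a three-word list (values are hypothetical)
ssid = "homenet"
target = derive_pmk("letmein99", ssid)
print(brute_force(["password", "12345678", "letmein99"], ssid, target))
```

The loop is embarrassingly parallel, which is why spreading it across rented GPU instances works so well.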

However, employing the theoretically infinite resources of cloud computing to brute force a password is the clever part.

Purchasing the computers to run such a crack would cost tens of thousands of dollars, but Roth claims that a typical wireless password can be guessed by EC2 and his software in about six minutes. He proved this by hacking networks in the area where he lives. The type of EC2 computers used in the attack costs 28 cents per minute, so $1.68 is all it could take to lay open a wireless network.
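Those quoted numbers are easy to check:

```python
# Checking the article's EC2 cost arithmetic against its own figures.
rate_per_sec = 400_000    # guesses per second, as quoted
minutes = 6               # typical crack time, as quoted
cost_per_minute = 0.28    # USD per instance-minute, as quoted

candidates_tested = rate_per_sec * 60 * minutes
total_cost = cost_per_minute * minutes

print(f"{candidates_tested:,} candidates tested for ${total_cost:.2f}")
```

Six minutes at 400,000 guesses per second covers 144 million candidates at the quoted $1.68 -- dictionary-scale rather than exhaustive search.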

Roth intends to make his software publicly available, and will soon present his research to the Black Hat conference in Washington, D.C.

Using EC2 for such ends would be against Amazon's terms of use, of course, but Reuters quotes Amazon spokesman Drew Herdener as saying that if Roth's tool is used merely for testing purposes, everything's above board.

Roth's intention is to show that wireless computing that relies on the pre-shared key (WPA-PSK) system for protection is fundamentally insecure. The WPA-PSK system is typically used by home users and smaller businesses, which lack the resources to invest in the more secure but complicated 802.1X authentication server system.

WPA-PSK relies on administrators setting a passphrase of up to 63 characters (or 64 hexadecimal digits). Anybody with the passphrase can gain access to the network. The passphrase can include most ASCII characters, including spaces.

WPA-PSK is believed to be secure because the computing power needed to run through all the possible passphrases is huge. Roth's conclusion is that cloud computing means that kind of computing power exists right now, at least for weak passwords, and is not prohibitively expensive.
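The gap between weak and strong passphrases is easy to quantify at the quoted rate (illustrative arithmetic; 95 approximates the printable-ASCII alphabet WPA-PSK permits):

```python
# Time to exhaust the full passphrase keyspace at the quoted rate.
rate = 400_000   # guesses per second, as quoted above
alphabet = 95    # printable ASCII characters, roughly what WPA-PSK allows

for length in (8, 10, 12):
    keyspace = alphabet ** length
    years = keyspace / rate / (3600 * 24 * 365)
    print(f"{length}-character passphrase: ~{years:,.0f} years to exhaust")
```

Even a random eight-character passphrase takes centuries to exhaust at 400,000 guesses per second; only dictionary words and short patterns fall within minutes.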

14 January 2011

Overcoming Cloud Security Issues

A number of recent surveys on cloud computing have shown that security in the cloud is the number one concern among organizations considering cloud adoption.

Large enterprises are still cautious about moving their applications and data to the cloud environment as many of them view it as a complete loss of control over security and data. There are many arguments within the IT community about whether the shortcomings of the cloud, with security being the most stated one, will outweigh the benefits or not. The obvious reality is that the cloud is still evolving, and solution providers battling for customer attention are developing and deploying more sophisticated security measures to ensure best security and privacy practices.

Some experts in the cloud domain even claim that the established giants of the cloud industry, such as Google, can provide higher security, considering that they can afford to employ the best specialists in the field versus a customer’s in-house security team. Wherever the truth lies, cloud customers, both SMBs and enterprises, should always demand transparency from a cloud vendor and receive detailed information on the security measures it has established. They should also ensure that the cloud provider lists the guaranteed security controls in the service level agreement (SLA).

Gartner has listed the following seven top security risks as potential threats to companies moving to a cloud environment:

user access to data and information
compliance with regulations
location of the data
the encryption used at every level
recovery measures in the event of a security breach
investigative support
long-term viability of the agreement between the provider and the user.

The first question that comes to every cloud customer’s mind is: ”Where does my data reside?” This is a very legitimate question, as the location of the data center can greatly affect data security. Some cloud vendors have data centers offshore in countries with different privacy and security laws, meaning that user data may be exposed to third parties, such as cloud administrators. Cloud customers should ask the solution provider about the location of its data centers and about the security measures it executes in case of a security breach.

When considering a cloud vendor, companies have to make sure that the provider is aware of its duty to assist the customer in complying with governmental data security and privacy standards.

Another important thing to verify is that the cloud vendor uses encryption to secure data at rest and in transit. The cloud service provider should encrypt data on storage devices at all times in order to prevent data breaches. Companies have to make sure that their data is protected when transmitted over the Internet by ensuring that it is always encrypted and authenticated by the cloud provider.
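From the customer side, the in-transit half of that requirement can at least be enforced when connecting to a provider's endpoints. A minimal sketch using Python's standard ssl module (a generic illustration, not tied to any particular vendor's stack):

```python
import ssl

# A client-side TLS context that refuses unencrypted, unauthenticated,
# or legacy-protocol connections -- the baseline for data in transit.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject old protocols

# create_default_context() already enables certificate and hostname
# verification, so a server with an invalid certificate is rejected.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True
```

Encryption at rest, by contrast, happens on the provider's storage layer and can only be verified contractually, e.g. through the SLA and audit reports discussed above.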

BizCloud is dedicated to helping companies mitigate potential risks involved with cloud migration. Partnering with the leading cloud providers, BizCloud identifies the best providers or solutions for our clients. The right choice of cloud vendor depends on a thorough assessment of their offerings which is exactly what BizCloud’s cloud experts do for our clients.

Since the security of data is one of the most important considerations when choosing a cloud vendor, we decided to highlight OpSource and their “defense-in-depth” security strategy that makes them a leader in enterprise-class security. Here is a short overview of the security measures provided by OpSource Cloud Hosting.

OpSource Cloud Hosting provides the security and control that enterprises demand. Unlike other commodity cloud services, OpSource provides an environment in which to configure and lock down your compute and storage environments. With OpSource Cloud Networks, customers are able to configure VLANs between servers, configure ACL-based firewalls, and control and track administrative usage. Data is encrypted while being transferred as well as at rest.

Rather than implementing their network security on top of their virtualized servers, OpSource Cloud Networks is a truly network-based implementation running within their Cisco switching fabric. Customers manage and configure OpSource Cloud Networks via the web-based OpSourceCloud.net user interface or Open API.

Role-based Administrative Control
VPN administration of all servers
Unique username and password for multiple administrators
Role-based permissions can limit an administrator to managing only certain resources, such as servers, storage or networks.
Audit logs of all environmental changes
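The model behind that list can be sketched in a few lines. The role names and resource types below are hypothetical, not OpSource's actual API:

```python
# Hypothetical role-based access control with an audit trail, mirroring
# the capabilities in the bullet list above.
ROLE_PERMISSIONS = {
    "server_admin":  {"servers"},
    "network_admin": {"networks"},
    "full_admin":    {"servers", "storage", "networks"},
}

audit_log = []  # every decision is recorded, like the audit logs above

def can_manage(role: str, resource: str) -> bool:
    """Allow the action only if the role's permission set covers it,
    and record the decision for later audit."""
    allowed = resource in ROLE_PERMISSIONS.get(role, set())
    audit_log.append((role, resource, allowed))
    return allowed

print(can_manage("server_admin", "servers"))   # True
print(can_manage("server_admin", "networks"))  # False
```

Keeping the permission map separate from the check keeps role changes a data edit rather than a code change, which is the usual rationale for RBAC.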

OpSource maintains SAS-70 attestation in conjunction with their auditor SAS 70 Solutions. Their SAS-70 attestation is based on an in-depth series of documented controls covering the operational management of the OpSource Cloud Hosting infrastructure.

24/7 Incident Response
OpSource maintains a Security Incident Response Team (OSIRT) to handle reports of security incidents. The OSIRT will escalate an incident to law enforcement and/or executive management as prescribed in its security policies.



13 January 2011

Dell’s Deals in Cloud Computing

Dell may be known across the world for its computers, but it's well on its way to emerging as a serious player in the cloud computing space, if its recent spate of acquisitions is any indication. Its recent announcement that it will acquire cloud-based medical archiving services provider InSite One, Inc. is another step in that direction.

Dell is looking to diversify from its niche as a hardware player and has acquired several companies in its bid towards becoming a solutions company, with 2009’s acquisition of Perot Systems being the biggest buy at $3.9 billion. Its shopping bag this year has mainly included companies in the enterprise solutions and cloud computing areas. Here’s a representative list:

1. Planned acquisition of Compellent Technologies for $960 million.
2. November acquisition of Boomi.
3. July acquisitions of Scalent and Ocarina Networks.
4. February acquisitions of Kace Networks and Exanet.

“We will roll out a comprehensive array of solutions for large and medium enterprises to help them migrate to a cloud computing model, specifically to private or hybrid clouds,” Dell’s Singapore-based managing director and general manager for South Asia and Korea, Ng Tian Beng has been quoted as saying in an interview. “We’re transforming from a hardware-centric company to a solutions provider. We see tremendous opportunity in the cloud, specifically our new VIS (Virtual Integrated System) architecture and services to help customers transition to open, cloud-like delivery models.”

Most of the activity will be centered in Asia, and with good reason – revenues from the region had grown by 29% in the third quarter of 2010 even as the rest of the world struggled to get out of the recession. Mr. Ng confirmed this approach, saying,

“We’re excited about the potential that this region offers for our solutions and services business. The goal for many of our new solutions is to reduce customer data management costs by 50%, thus freeing up more funds for strategic IT activities. We will integrate the new technologies from our acquisitions and from our business partners.”

After all these developments in 2010, 2011 ought to see some big announcements from Dell in cloud computing. With Dell’s entry, there are several big players in this space. Microsoft, Google, IBM and Salesforce.com were some early starters, joined by Oracle this year and Dell soon to follow suit.



10 Steps To Mobile Worker Support

Today, millions of workers never set foot in offices, and the ability to work from anywhere is a powerful recruiting tool. Fortunately, supporting roving employees is easier than ever. Here's how.

The permanently mobile workforce is a powerful tool for reducing costs--from real estate and utilities to travel and equipment--while simultaneously boosting productivity and morale. Put in place a strong telework program, and even the smallest company has access to a multinational talent pool. Blizzards, flu pandemics, traffic gridlock, general pestilence? No problem.
But a productivity win for the business can be quite the opposite for IT teams suddenly facing increased security risks from employee use of uncontrolled public networks and the need to accelerate deployment of collaboration and social networking technologies to geographically dispersed workgroups. Planning for remote access and mobility requires a focus on network security, client management, and Internet-centric communications as well as policies that regulate a new work paradigm.

One CPA firm that supports the federal government makes it clear to employees that mobility and security come with a convenience trade-off. "We use full-disk encryption at the BIOS level for all laptops, standard," a principal with the firm says. On customer sites, auditors employ Seagate BlackArmor NAS devices that provide advanced security capabilities. IT must approve any application installations on company gear. "We try to be flexible, but if IT thinks an application might pose a risk, the employee will need to justify why it's needed."
Setting ground rules up front is smart business, and not just around security. There are many elements to a successful remote worker program, not all of them technology related. We broke our 10 best practices down into hard and soft requirements--those focused on enabling technologies and those dealing with policies, management, and administration.



12 January 2011

How Business could have saved millions in the snow

As the infographic shows, the UK economy has been losing up to £1.2 billion a day during the current bad weather, as up to a fifth of the workforce has been unable to get to work. Over half of these workers have no access to their work systems or the ability to work from home using services such as Microsoft BPOS or Microsoft Office 365. Provisioning such systems has been prohibitively expensive in the past, but with a cloud IT system the provision would be automatically available, along with communication facilities such as web conferencing, meaning you can reschedule meetings online whilst stuck at home.

It will be no surprise to anyone that the UK economy has been hit hard by recent weather conditions. It is a trend that has been developing over recent UK winters, and the amount of money lost to the economy is increasing. This is likely due to the rise in winter precipitation that the Met Office has reported over the past years, and which is projected to increase in the future.

So it is not the dropping temperature that is increasing the snow and ice problems, but the amount of snow and ice we can expect when the conditions are right. Business owners and managers are aware that this is a concern, with over half of them citing lack of access to the workplace as a serious risk to their business; 73% of companies suffered staff shortages in November this year.

However, less than half have plans in place to keep their company up and running if they are hit by bad weather. With around 11% of businesses having had to close at some point this year due to lack of access to the workplace, something needs to be done. Small business owners in particular need to make sure they are set up to deal with such weather conditions so they don’t join the estimated 2,000-3,000 small businesses that have been pushed over the edge by the current climate.
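The figures above lend themselves to a rough worked estimate. This is a purely illustrative back-of-envelope calculation; the assumption that avoidable losses scale with the share of absent workers who lack remote access is ours, not the article's:

```python
# Back-of-envelope estimate using the figures quoted above.
# The scaling assumption is illustrative only.
daily_loss_gbp = 1.2e9    # up to £1.2 billion lost per day of bad weather
absent_share = 0.20       # up to a fifth of the workforce unable to get in
                          # (already reflected in the £1.2bn figure)
no_remote_access = 0.50   # over half of absent workers lack remote access

# If those workers had cloud access, their share of the daily loss
# could, in principle, have been avoided.
recoverable = daily_loss_gbp * no_remote_access
print(f"Potentially recoverable per day: £{recoverable:,.0f}")
```

On these assumptions, something like £600 million of the daily loss would be attributable to workers who could have worked from home had the systems been provisioned.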

The most frustrating factor in all of this is that it need not cost a company any more money to protect itself against being ‘snowed in’. A cloud IT suite such as BPOS, ZOHO or Google Apps Premier would mean workers could access the exact same systems from home as they would at work.

Having your Exchange email in the cloud allows you to scale up and down on demand, and means you never have to call someone out to fix your server. To different extents, these suites can also provision web meetings, conferences and presentations, and the person you are ‘meeting’ does not need to have the same systems in place.

The mobile workforce is on the increase worldwide, and staff would gain greater flexibility plus the opportunity to access information, and thus deal with issues, on demand. Regardless of the weather, if an employee doesn’t have access to the office system while travelling (or stuck) on a train, for instance, then they are not as efficient a worker as they could be.

With an in-house system, if power or the internet is lost from an office due to snow, wind, ice or flooding, then any link to your company email is gone. With a cloud system this is not the case. The physical location of your office is not important; with 24-hour cloud facilities there will always be someone taking care of your ‘server’.

Perhaps the biggest bonus here, though, is the positive effect the cloud can have on global warming. By sharing computing power, rather than overprovisioning, we all help reduce CO2 emissions and thus help mitigate the conditions that have led to such increased snow and rainfall.



The Enterprise Cloud Revolution

Leading enterprises are revolutionizing the delivery of IT today. In his keynote at Cloud Expo Silicon Valley, Tony Bishop, CEO of Adaptivity, showcased multiple case studies and lessons learned from Global 2000 organizations that have radically changed the delivery of IT in their organizations by employing Cloud Utility IT models.

About the Speaker:
Tony Bishop is the Founder, Chairman, and CEO of Adaptivity. He leads the team and provides hands-on coaching, thought leadership, and executive strategy support for the company's key clients and partners. He is an innovative IT executive with an excellent track record in the strategy, design, and implementation of business-aligned enterprise technology platforms across large organizations.

A Rock Star Faculty, Top Keynotes, Sessions, and Top Delegates!
Cloud Expo 2011 New York, June 6-9, 2011, at the Javits Center, New York City, New York, will feature technical sessions from a rock star conference faculty and the leading Cloud industry players in the world.

The growth and success of Cloud Computing will be on display at the upcoming Cloud Expo conferences and exhibitions in New York, June 6-9, and in Santa Clara, November 7-10, 2011.

The recent Cloud Expo at the Santa Clara Convention Center in Santa Clara, CA, was the largest Cloud Computing conference ever produced, with more sponsors, exhibitors, and delegates than all other Cloud events of the year combined.

The four-day New York event will attract more than 10,000 delegates from 48 countries and over 600 sponsors and exhibitors on a 120,000 sq ft show floor!

All main layers of the Cloud ecosystem will be represented in the 7th and 8th International Cloud Expo - the infrastructure players, the platform providers, and those offering applications, and they'll all be here to speak, sponsor, exhibit and network.

"Cloud Expo was announced on February 24, 2007, the day the term ‘cloud computing' was coined," said Fuat Kircaali, founder and chairman of SYS-CON Events, Inc. "Cloud has become synonymous with ‘computing' and ‘software' in two short years, and this event has become the new PC Expo, Comdex, and InternetWorld of our decade. By 2012, more than 50,000 delegates per year will be attending Cloud Expo."



11 January 2011

The Future of SaaS

Barcelona -- Welcome to a special BriefingsDirect podcast from the HP Software Universe 2010 Conference in Barcelona, an interview with Kevin Bury, Vice President and General Manager, and Neil Ashizawa, Manager of Products, both with HP Software as a Service.

We were at Software Universe in early December to explore major enterprise software and solutions trends and innovations making news across HP’s ecosystem of customers, partners, and developers. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

This discussion, with two executives from HP, focuses on the software as a service (SaaS) market and how it and cloud computing are reshaping the future of IT.

Dana Gardner, Principal Analyst at Interarbor Solutions, moderated the discussion just after the roll-out of HP’s big application lifecycle management (ALM) news, the release of ALM 11.

Here are some excerpts on the future of SaaS discussion:
Bury: We are seeing a lot of interest in the market today for SaaS and cloud. I think it’s an extension of what we've seen over the last decade: companies looking at ways to drive the most efficiency from their IT budgets. And especially in these trying economic times, as they try to do as much as they can, they're looking for ways to optimize their investment.

When you look at what they are doing with SaaS, it gives them the ability to outsource applications, take advantage of the cloud, take advantage of web technologies to be able to deliver those software solutions to their customers or constituents inside of the business, and do it in a way where they can drive speed to value very, very quickly.

They can take advantage of getting more bang for their buck, because they don’t have to have their people focused on those initiatives internally and they're able to do it in a financial model that gives them tremendous value, because they can treat it as an operating expense as opposed to a capital expense. So, as we look to the interest of our customers, we're seeing a lot more interest in, "HP, help us understand what is available as a service."

Various components then include SaaS, infrastructure as a service (IaaS), certainly platform as a service (PaaS), with the ultimate goal of moving more and more into the cloud. SaaS is a stepping stone to get there, and today about half of all of the cloud types of solutions start with SaaS.

Where is this thing going? When is it going to end? Is it going to end? I don’t believe it is. I think it’s an ongoing continuum. It’s really an evolution of what services their constituents are trying to consume, and the business is responding by looking for different alternatives to provide those solutions.

For example, if you look at where SaaS got started, it got started because business departments were frustrated that IT wasn’t responsive enough. They went off and made decisions to start consuming application service provider (ASP) solutions, and they implemented them very, very quickly. At first, IT was unaware of this.

Now, as IT has become more aware of this, they recognize that their business users are expecting more. So, they're saying, "Okay, we need to not only embrace it, but we need to bring it in-house, figure out how we can work with them to ensure that we are still driving standardization, and we're still addressing all of the compliance and security issues."

Corporate data is absolutely the most valuable asset that most companies have, and so they have seen now that they have to embrace it. But, as they look down the road, it moves from just SaaS into now looking at a hybrid model, where they're going to embrace IaaS and Platform as a Service, which really formed the foundation of what the cloud is and what we can see of it today. But, it will continue to evolve, mature, and offer new things that we don’t even know about yet.

Somewhere in between

Ashizawa: About a year, year-and-a-half ago, people were still trying to get their minds wrapped around this idea of cloud. We're at a stage now where a lot of organizations are actually adopting the cloud as a sourcing strategy or are building strategies to adopt it. We're probably past the early adopters and more into the mainstream. I anticipate it will continue to grow and gain momentum.

Now, IT is becoming much more involved. I would say that they are actually becoming more of a broker. Before, when it came to providing services to drive the business, they were more focused on build. Now, with the cloud, they're acting in the role of a broker, as Kevin said, so that they can deliver the business benefits of the cloud.

One of the key differentiators, as it’s evolved, in the way I see it, is really in the economic principles behind cloud versus managed service and ASP. With cloud, as Kevin mentioned earlier, you basically leverage your operation expense budgets and reduce that capitalization that typically you would still need to do in a historic ASP or managed service.

Cloud brings to the table a very compelling economic business model that is very important to large organizations.
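The operating-expense argument can be made concrete with a simple cash-flow sketch. All of the figures below are hypothetical, chosen only to show the shape of a capex-versus-subscription comparison; none of them come from the interview:

```python
# Illustrative capex-vs-opex comparison over a three-year horizon.
# All prices are hypothetical and exist only to show the cash-flow shape.
capex_upfront = 100_000       # hardware and perpetual licenses, paid up front
capex_annual_maint = 18_000   # yearly maintenance contract on owned gear
saas_monthly = 3_500          # all-inclusive monthly subscription

def capex_total(months: int) -> float:
    """Cumulative cost of the ownership model after `months`."""
    return capex_upfront + capex_annual_maint * (months / 12)

def saas_total(months: int) -> float:
    """Cumulative cost of the subscription model after `months`."""
    return saas_monthly * months

for months in (12, 24, 36):
    print(months, capex_total(months), saas_total(months))
```

The subscription stays cheaper at every checkpoint in this particular scenario, but the real point of the model is the first month: the opex route avoids the large upfront outlay entirely, which is the cash-flow benefit described above.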

But if they are going to adopt a SaaS solution, they should vet out the integration possibilities -- get out in front of that. Also, integration doesn’t stop at the technical level. There are business aspects of integration as well. You need to make sure that the service levels are what your business users desire and can be enforced, and also consider integration from the support model.

If the user needs help, what’s the escalation path? What’s the point of contact? Who is actually going to help them, given that there is now a cloud vendor in the mix as well as the cloud consumer?

Bury: Organizations can become overwhelmed by the promise and the hype of cloud and what it can offer. My recommendation is usually to start with something small. I go out and spend a lot of time talking to our customers and prospective customers. There are a couple of very common bits of feedback that I hear that CXOs are looking at, when they view where to start with a cloud or as a service type of initiative.

The first of these is: is it core to my business? If a business process is absolutely core to what they are doing, it’s probably not a great place to start. However, if it’s not core, if it’s something ancillary or complementary to that, it may make sense to look at outsourcing it, or moving it to the cloud.

The second is whether it’s mission-critical or not. If it’s mission-critical and it’s core, that’s something you want your scarce, highly valued IT resources working on, because that’s what ultimately drives the business value of IT. Going back to what Neil said earlier, IT is becoming a broker. They only have so much bandwidth to deliver those solutions and offerings to their customers. So, if it’s not core and it’s not critical, those are good candidates.
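Bury's two screening questions amount to a simple two-by-two triage. As a hypothetical sketch of that logic (the function name and the labels are ours, not HP's):

```python
# Hypothetical encoding of the "core vs. mission-critical" screen
# described above: workloads that are neither core to the business nor
# mission-critical are the suggested starting candidates for cloud/SaaS.
def cloud_candidate(is_core: bool, is_mission_critical: bool) -> str:
    if not is_core and not is_mission_critical:
        return "good candidate: start here"
    if is_core and is_mission_critical:
        return "keep in-house: focus scarce IT resources here"
    # Mixed cases need a judgment call on risk vs. benefit.
    return "evaluate case by case"

print(cloud_candidate(False, False))
```

A payroll-adjacent test environment, for instance, would land in the first bucket, while the core transaction system itself lands in the second.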

We recommend starting small. Certainly, IT needs to be very involved with that. Then, as you get more and more comfortable and see more value, you can continue to expand. In addition, we see projects that make a lot of sense, such as testing as a service, where IT organizations can leverage technology available through their partners but delivered via a cloud or SaaS solution, as opposed to bringing it in-house.

Key opportunity for HP

We see SaaS as one of the key drivers, one of the strategic initiatives for HP to embrace. As I talk with my peers on the leadership team, we recognize SaaS as one of only two consumption models customers have for obtaining software from HP. In the traditional license play, they consume the license and pay maintenance; or, if they want to treat it as an operating expense, it will be via the SaaS model.

As we look to what we need to do, we're investing very heavily in making all of our applications SaaS-ready, so that customers can stand them up in their own data center, in our data center, or via a hybrid, which may involve a combination of those or even include a third party.

For example, they may have a managed service provider that is providing some of the testing services. To your point earlier about integration, HP, because of the breadth and depth of our applications, can integrate that three-way type of solution, whereas other companies don’t have the depth to pull that off.

As SaaS now becomes much more mainstream and much more mature, big customers are now looking to companies like HP, because of the fact that we have the size, the depth, and the breadth of the solutions.

