Friday, 29 March 2013

Australia a world leader in Cloud computing adoption: BSA | ARN

Hafizah Osman | 26 March, 2013 

Study shows Australia ranks second in Cloud computing adoption, behind Japan

Australia is one of the world leaders in Cloud computing policies, maintaining its second place ranking just behind Japan in the global Cloud computing standings, a new study by the Software Alliance (BSA) has found.

The 2013 BSA Global Cloud Computing Scorecard, which analysed the shifting global policy landscape for Cloud computing, evaluated 24 economies globally in seven policy areas critical to the market for Cloud computing services.

The study praised Australia for its up-to-date cybercrime regime, bolstered by its accession to the Convention on Cybercrime.

It was also commended for its international harmonisation of rules; its security ranking received a boost when the government dropped plans for mandatory Internet filtering.

The Federal Government also recently announced a new cyber security centre in Canberra and an additional $1.46 billion in funding for cyber security as part of a new national security blueprint.

BSA Asia-Pacific senior director for government relations and policy, Roger Somerville, said that although Australia has improved in many areas of the scorecard by adopting and enhancing policies that are conducive to Cloud innovation, there remains room for improvement.

“Every country’s policies affect the global Cloud marketplace, so it is imperative for Australia to continue to focus on improvements.

“We encourage the Australian government to continue to commit to public sector Cloud use and adoption, similar to the US government’s approach of adopting a Cloud first policy. This will help Australia maintain, if not improve, on its ranking and help grow the global Cloud,” he said.

The study also found that the sharp divide between advanced economies and the developing world, revealed in the 2012 scorecard, has narrowed this year.

Somerville said this is a result of significant progress made by some developing countries, many of which are in the Asia-Pacific, along with steady progress in major developed countries.

Japan, Australia, US Top Nations for Cloud: Study

By Robert Lemos  |  Posted 2013-03-07

A survey of the security and data-privacy policies for cloud computing found that Singapore, Canada and Russia made the greatest leaps forward for secure, open policy environments.

Nations around the world have made spotty progress on national policies to support cloud computing and digital commerce, with a few standouts, such as Singapore, passing solid privacy and security regulations, according to a survey of 24 countries by the Business Software Alliance.

In the "2013 BSA Global Cloud Computing Scorecard," Asian nations ran the gamut of marks for policy. Japan topped the list of nations with comprehensive laws supporting cloud computing and digital commerce, while Vietnam brought up the bottom due to a lack of regulations. Australia and the United States claimed the No. 2 and No. 3 spots, respectively, while Singapore's strong privacy and security regulations helped it vault to fifth place from number 10 last year.

"We see some really patchy progress around the world," said Chris Hopfensperger, technology policy counsel for the BSA. "Countries like Singapore have embraced a future of wanting to be a digital hub, [while] in Europe, we have seen real stalling across the board."

The survey measures how friendly nations' policies are to cloud computing and digital commerce, taking into account factors such as whether there are laws dealing with privacy, security, cyber-crime and intellectual property. Because the Business Software Alliance has historically been most interested in protecting its members’ software from piracy, the report gives appropriate intellectual property regulations the most weight among those four issues. The largest factor, however, was the readiness of a country's infrastructure to handle digital commerce.

The top-five nations were Japan, Australia, the United States, Germany and Singapore, while the bottom five were Vietnam, Thailand, Brazil, Indonesia and South Africa. Singapore, Canada, Russia, India, China and Brazil all moved up at least two positions in the rankings. Although it didn't move up in rankings, Malaysia added the most points to its score by improving its cyber-crime and intellectual-property laws, Hopfensperger said.

"Malaysia crossed the digital divide from a developing market to a digital market," he said. "We hope for more of that."

Singapore, the biggest gainer in rankings, passed a new privacy regime last year that borrows from the European Union model as well as the Asia-Pacific Economic Cooperation (APEC) privacy framework. The law balances the obvious need for personal data protection with the ability of companies to move data through the cloud to support digital commerce, Hopfensperger said.

In the United States, despite fears among some companies and individuals that the government could sift through their cloud data, the reality is that the policies are quite supportive of privacy and security, said Hopfensperger.

"There are really strong protections in U.S. law to prevent the government from lifting the hood and digging around in your data," he said. "They have to have pretty good reasons to do so. There are a lot of barriers that get in the way."

Asia looks to cloud computing

Author: Clive Davidson
Source: Asia Risk | 19 Dec 2012

Rapid technological advances mean that financial institutions in Asia are looking to cloud computing solutions as an alternative to investing in internal infrastructure. But regulators around the region are cautious about the potential pitfalls involved.

The nature of microprocessor development is such that computer capabilities have doubled roughly every 18 months ever since microprocessors were invented more than 40 years ago. In addition to the improvement in individual chips, technologists have devised clever ways to organise them for even more performance, such as parallel processing architectures and computer grids. The latest innovation is cloud computing: creating huge pools of processing and data management resources and making them available online and on demand.

Cloud computing promises enormous performance advances and cost efficiencies, and the transfer of all the headaches of systems implementation, maintenance and upgrades away from user organisations to cloud service providers. These are precisely the issues that many financial institutions are struggling with. Banks, hedge funds, asset managers, insurers and other organisations require ever greater computational power for applications such as counterparty credit risk analysis and economic capital calculation, while at the same time budgets are under intense pressure. As a result, keeping on top of technology advances is becoming increasingly challenging.

Given these challenges, a number of banks in the region are pioneering the use of cloud to support their operations, with Commonwealth Bank of Australia (CBA) and National Australia Bank (NAB) leading the way. Some hedge funds and other buy-side firms are following suit, finding in cloud a way of accessing powerful, complex technology they would not be able to afford or manage in-house themselves. Cloud can even offer developing countries a means of leapfrogging the technology development process and taking immediate advantage of advanced computing capabilities to create their market infrastructure.

Other institutions are more cautious, raising issues about data security, application control and systems reliability. Regulators in a number of jurisdictions, including Australia, Hong Kong and Singapore, have also expressed concern, and have issued warnings to institutions to look before they leap into the computing cloud.

The key concept of cloud is virtualisation. In traditional computing, an application – such as a value-at-risk (VaR) calculator – sits on a particular machine with its own processor, operating system, memory and storage. With cloud, all these elements are collected in a pool of resources and a ‘virtual’ machine is created that is tailored to the requirements of each user request and only for as long as needed, after which the processors, memory and other elements return to the pool.
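
To make the VaR example concrete, the core of such a calculator can be sketched in a few lines of Python. The portfolio value, volatility and normal-returns assumption below are purely illustrative, not taken from any institution mentioned here.

```python
import random

def monte_carlo_var(portfolio_value, mu, sigma, confidence=0.99,
                    n_paths=100_000, seed=42):
    """Estimate one-day value-at-risk by simulating normally distributed returns."""
    rng = random.Random(seed)
    # simulate one-day P&L for each path, then sort to read off the quantile
    pnl = sorted(portfolio_value * rng.gauss(mu, sigma) for _ in range(n_paths))
    # VaR is the loss at the (1 - confidence) quantile of the P&L distribution
    return -pnl[int((1 - confidence) * n_paths)]

var_99 = monte_carlo_var(1_000_000, mu=0.0, sigma=0.01)
print(f"99% one-day VaR: ${var_99:,.0f}")
```

In a cloud setting, the point is that the 100,000 paths here could just as easily be 100 million, run on a virtual machine assembled from the pool only for the duration of the calculation.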

In addition to providing virtual machines for specific applications, clouds can also be used to provide more generalised computing facilities, such as IT infrastructure or platforms (virtual machines that run a number of applications). Cloud resources are generally located in dedicated data centres and can operate as private clouds (proprietary to an organisation), community clouds (shared by a number of organisations), public clouds (open to all) or hybrid clouds (such as private-community).

NAB is a third of the way through a multi-year, multi-billion dollar technology transformation programme that includes establishing a substantial private cloud for IT infrastructure, platforms and applications. “Our approach is long term because in the past short-term decisions got banks into the current situation of having to deal with a mix of old and new technologies,” says Denis McGee, chief technology officer. NAB is developing its data centres in partnership with IBM, a leading cloud services provider. Among the benefits of the cloud approach is the reduction in total power requirements for hardware – now a significant cost for any major IT set-up – as well as the ability to keep the technology up to date independently of the applications, and the speed and simplicity of providing virtual – as opposed to real – machines for new applications, says McGee. This latter aspect is particularly important in areas where time to market can be critical, such as trading and risk management.

CBA, meanwhile, says that moving to a cloud infrastructure enables it to get a virtual machine up and running for a new application in 90 minutes at a cost of just $150, compared with doing it internally in three weeks at a cost of $10,000. The bank, which spends nearly $1 billion annually on IT, has been moving applications into a private cloud for the past four years and says it has already saved tens of millions of dollars, and expects to save hundreds of millions more in the future by using cloud. Areas where cloud has been particularly effective so far are data storage and application testing and development, where the bank has been able to halve costs.

Last year, Westpac, in conjunction with Microsoft and New York-based pricing and analytics vendor Numerix, ran a pilot project to demonstrate how computationally intensive pricing calculations using Numerix software could be uploaded into Microsoft’s Windows Azure cloud. The pilot showed that what took more than five hours to compute on an in-house machine took less than 30 minutes in the Windows Azure cloud.

The elastic effect
The use of cloud to provide this type of ‘elasticity’ in processing power for ad hoc computational demands – either in short bursts for pricing instruments or longer stretches such as running periodic stress tests – is one of the key areas of interest for the financial industry, says Seattle-based Rupesh Khendry, industry solutions director for capital markets for Microsoft. Other major cloud opportunities include storing the huge volumes of data they now regularly generate, and becoming development and test environments for new applications.

In the latter case, a bank might want significant resources to test a major new business programme, but only need it for six months. Its next development project, however, may require a very different configuration of resources. In this case, cloud can offer a far more efficient and cost-effective way of meeting these demands than conventional in-house architectures, says Khendry.

A number of other risk technology vendors besides Numerix are looking at cloud as a way of providing turbo boosts for calculations. For example, New York-based Risk Integrated has adapted its Specialized Finance System (SFS) risk management and reporting system for commercial real estate and project finance to run in the cloud. It has established a private-community cloud, as well as developed a facility called SFS Cloud-burst that can reach out into public clouds, such as those provided by Microsoft, Google or IBM, for extra resources for high-intensity portfolio modelling.

Banks with in-house versions of the SFS typically install the system on 10-40 servers with internal infrastructure costs of hundreds of thousands of dollars, says Yusuf Jafry, chief technology officer (CTO) of Risk Integrated. SFS Cloud-burst enables users to call up several hundred servers for a modelling session, paying only for the time they use them. Buying this many extra servers – which would sit idle for much of the time – is out of the question for banks. Meanwhile, having the performance elasticity of cloud means banks can run more comprehensive and detailed risk models, says Jafry. Although only recently introduced, the Cloud-burst and cloud service versions of SFS are already being used by clients; in addition, Risk Integrated is in negotiation with an Asia-based institution to provide it with the SFS cloud service version.

One of the most successful commercial uses of cloud so far is by Apple to support its iPhone, iPad and App Store. By opening its iPhone and iPad platforms to third-party developers, Apple unleashed a tidal wave of creativity that extended the use of its phone for a myriad of business and leisure purposes, including a host of financial apps. New York-based Imagine Software has taken the platform and app concept and applied it to its Imagine Trading System. The company was already offering the portfolio and risk management system as a hosted service.

“We were sitting on all this technological capability and data [in the Imagine Trading System] but its potential was bottlenecked by the ideas and resources we had to tap into it,” says Steven Harrison, president and chief operating officer of Imagine. To overcome this bottleneck, the company introduced an open platform version of its software in July called Imagine Financial Platform (IFP) and the Imagine Marketplace app store. “It’s like we just hired thousands of new programmers around the world, each with their own motivations and expertise to build on top of our platform,” says Harrison.

One of the first to adopt the new Imagine apps is Hong Kong-based hedge fund Nine Masts Capital. It has used Imagine and third-party developers to help it create a number of apps for pre-trade compliance, management of borrowing and improvements in reporting. “As an adviser to funds and trading in a global market, there are a wide number of regulatory changes that require enhanced reporting and complex risk management,” says Elaine Davis, chief operating officer at Nine Masts. “The new IFP has significantly decreased the time it takes for us to get them running.”

The firm already has apps for Hong Kong short reporting and the US Form PF for hedge fund reporting. “We’ve also been able to use the system to create reports quickly for investors. As investors become more savvy in terms of risk, they have their own unique requirements in terms of how they like to receive data,” says Davis. In other words, she says, Imagine apps and the IFP provide a quick, low-cost way of producing the reports.

Warning signs
While Nine Masts, Imagine, Risk Integrated and the Australian banks are among those companies already demonstrating the potential of cloud in financial services, the technology is not without its issues, and there are a number of voices in the market urging caution in its deployment. Neil Bartlett, CTO at Toronto-based risk management system provider IBM Risk Analytics, points out that many mature conventional risk systems at banks have been optimised at every level to meet the intense computational demands of pricing and risk analytics.

For example, code is optimised for speed of execution, memory is fine-tuned and data is stored in ways that make it readily accessible during processing. Furthermore, highly efficient parallel processing architectures have been installed to exploit the inherently parallel nature of many pricing and risk applications, where the same process is often repeated many times with changing variables – such as in Monte Carlo simulation.
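
That inherently parallel shape is easy to sketch with Python’s standard library: each worker repeats the same pricing process over its own batch of random paths, and the partial results are combined at the end. The payoff and parameters are invented for illustration.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def simulate_chunk(args):
    """One worker's share: price a call-style payoff over its own batch of paths."""
    seed, n_paths = args
    rng = random.Random(seed)
    return sum(max(rng.gauss(100, 15) - 100, 0) for _ in range(n_paths))

def parallel_price(n_workers=4, paths_per_worker=50_000):
    # the same process repeated with changing variables -- only the seed differs
    jobs = [(seed, paths_per_worker) for seed in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        totals = pool.map(simulate_chunk, jobs)
    return sum(totals) / (n_workers * paths_per_worker)

if __name__ == "__main__":
    print(f"Estimated payoff value: {parallel_price():.2f}")
```

Swapping the process pool for a grid, or for a cloud of virtual machines, changes the scheduling rather than the structure, which is why Monte Carlo workloads port to cloud so readily.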

Because it virtualises hardware resources, cloud undermines this optimisation and can impose a performance penalty, says Bartlett. Risk analytics plucked out of an optimised environment and placed in the cloud today can suffer a more than twofold performance loss. But, as Bartlett also points out, just 12 months ago the penalty was more than twentyfold, which gives an indication of cloud technology’s rapid evolution.

Meanwhile, there are areas where the performance penalty can be acceptable, such as in the pre-production and testing phase of systems. “Pre-production is an emerging case for cloud because you don’t want to have large hardware resources dedicated to the process, although you still want to do some sizeable tests from time to time,” says Bartlett. It is worth noting that CBA is a pioneer in this area and has already halved its costs in application testing and development by using cloud.

Other voices of concern come from the regulators. The Australian Prudential Regulation Authority (APRA) was among the first to raise concerns. In November 2010 it wrote to banks and insurers warning them that it viewed cloud computing as a form of outsourcing and that any approach must be “subjected to the usual rigour of existing outsourcing and risk management frameworks”, with board and senior management fully informed and engaged. Its key concern was the potential compromise of a financial institution’s ability to continue operations and meet core obligations following a loss of cloud computing services. Confidentiality and the integrity of sensitive data (including customer information), and compliance with legislative and prudential requirements, were other concerns.

In July 2011, Wan Aik Chye, director and head of the specialist risk department at the Monetary Authority of Singapore (MAS), wrote in a similar vein to all the financial institutions MAS supervises, reminding them of their responsibilities with regard to outsourcing – including the use of cloud computing. “In the case of cloud computing, financial institutions should be aware of unique attributes and risks, especially in the areas of data integrity, recoverability and confidentiality as well as legal issues such as regulatory compliance and auditing,” wrote Chye. “In particular, as cloud computing service providers typically process data for multiple customers, financial institutions should pay attention to the service providers’ ability to isolate and clearly identify their customer data and other information system assets for protection.”

Meanwhile, the Hong Kong Monetary Authority told AsiaRisk that it took a similar view and that its existing policy manual on outsourcing applied equally to cloud computing, particularly with respect to data ownership and protection.

Cloud advocates such as CBA and NAB say that although cloud presents some particular issues, many banks throughout the region have all kinds of outsourcing arrangements already in place and accepted by regulators, and that cloud is not very different in terms of compliance.

In addition, cloud service providers say they can help institutions overcome many of these concerns. Microsoft, for example, has data centres in most of the major jurisdictions in the region, which means it can help banks comply with the need to keep client data within local borders, says Khendry. Furthermore, the level of data security at the cloud providers’ data centres is often higher than institutions are able to implement within their own infrastructures.

This is certainly true for smaller, less technologically sophisticated firms. Davis points out that Imagine – with its dedicated data centres that serve a global community of users – is significantly bigger than Nine Masts. “We probably have a bigger risk that our facilities will become compromised than Imagine does,” she says. Furthermore, because cloud exploits the inherent resilience of the internet, it offers multiple access routes and high reliability. This was borne out in October when Hurricane Sandy struck the East Coast of the US where Imagine is based. “We had no availability issues at all with the Imagine service during the period of stress,” says Davis.

Imagine, as a major software vendor, may be a lot bigger than a hedge fund such as Nine Masts but, as Lehman Brothers proved, no financial organisation is too big to fail any more. Davis says she is less worried about issues around cloud computing than about the possibility that Imagine might go out of business. In that case, what would happen to Nine Masts’ data and apps? Would the firm be able to simply move them to another cloud services provider? This is APRA’s concern about the potential compromise of a financial institution’s ability to continue operations and meet core obligations following a loss of cloud computing services. It would also mean that cloud providers would have to ‘interoperate’, which in turn requires common technological standards.

The migration season
To address these and other concerns about cloud, a consortium of banks, corporates and other major IT users formed the Open Data Center Alliance (ODCA) in 2010. The Alliance’s steering committee includes NAB, Deutsche Bank, JP Morgan Chase and UBS. The ODCA’s aim is “to speed the migration to cloud computing” through the development of cloud computing standards, thereby avoiding vendor lock-in by encouraging interoperability between cloud providers.

One of the ODCA’s initiatives is to create common standard commercial frameworks for the implementation of cloud in individual industries, incorporating their specific regulations and laws. The ODCA is using the International Swaps and Derivatives Association’s (Isda) Master Agreement for derivatives transactions as a model, and hopes its frameworks will play a similar role in the growth of cloud computing as the Isda Master Agreement played in the growth of the OTC derivatives markets.

The use of cloud is likely to grow significantly in the coming years across the region. IT companies such as Microsoft and IBM are investing huge sums in the underlying technology, while many trading and risk system vendors besides Numerix, Risk Integrated and Imagine are enabling their systems to run in clouds.

In 2011, the Chinese government identified cloud as a strategic emerging industry and is supporting a number of pilot cloud projects – particularly in the financial sector. In addition, Japan, South Korea and India each have projects exploring cloud as a mechanism for delivering banking technology infrastructure services. Meanwhile, Daiwa Institute of Research, NTT Data and Fujitsu have been hired to design a new technology infrastructure for Myanmar and a source has told AsiaRisk that they are proposing to use cloud to help deliver it.

It may take a few more years, but there is no doubt that a strong cloud is settling in across the region’s financial sector.

Why is Big Data Revolutionary? | ZDNet

By Andrew Brust for Big on Data

Last week, Dan Kusnetzky and I participated in a ZDNet Great Debate titled “Big Data: Revolution or evolution?”  As you might expect, I advocated for the “revolution” position.  The fact is I probably could have argued either side, as sometimes I view Big Data products and technologies as BI (business intelligence) in we-can-connect-to-Hadoop-too clothing.

But in the end, I really do see Big Data as different and significantly so.  And the debate really helped me articulate my position, even to myself.  So I present here an abridged version of my debate assertions and rebuttals.

Big Data’s manifesto: don’t be afraid

Big Data is unmistakably revolutionary. For the first time in the technology world, we’re thinking about how to collect more data and analyze it, instead of how to reduce data and archive what’s left. We’re no longer intimidated by data volumes; now we seek out extra data to help us gain even further insight into our businesses, our governments, and our society.

The advent of distributed processing over clusters of commodity servers and disks is a big part of what’s driving this, but so too is the low and falling price of storage. While the technology, and indeed the need, to collect, process and analyze Big Data, has been with us for quite some time, doing so hasn’t been efficient or economical until recently. And therein lies the revolution: everything we always wanted to know about our data but were afraid to ask. Now we don’t have to be afraid.

A Big Data definition

My primary definition of Big Data is the area of tech concerned with procurement and analysis of very granular, event-driven data. That involves Internet-derived data that scales well beyond Web site analytics, as well as sensor data, much of which we’ve thrown away until recently. Data that used to be cast off as exhaust is now the fuel for deeper understanding about operations, customer interactions and natural phenomena. To me, that’s the Big Data standard.

Event-driven data sets are too big for transactional database systems to handle efficiently. Big Data technologies like Hadoop, complex event processing (CEP) and massively parallel processing (MPP) systems are built for these workloads. Transactional systems will improve, but there will always be a threshold beyond which they were not designed to be used.
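
The map/reduce pattern that Hadoop popularised for exactly these event-driven workloads can be illustrated with a toy in-memory version in Python; the event log is invented, and real deployments shard both phases across a cluster.

```python
from collections import defaultdict

# toy event log: the granular, event-driven records described above
events = [("u1", "click"), ("u2", "view"), ("u1", "click"),
          ("u3", "view"), ("u2", "click")]

def map_phase(records):
    # emit a (key, 1) pair per event, as a Hadoop mapper would
    return [((user, etype), 1) for user, etype in records]

def reduce_phase(pairs):
    # group by key and sum the counts, as the reducers would
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

counts = reduce_phase(map_phase(events))
print(counts[("u1", "click")])  # → 2
```

Because the map step touches each record independently and the reduce step only needs records sharing a key, both phases spread naturally across cheap commodity servers.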

2012: Year of Big Data?

Big Data is becoming mainstream…it’s moving from specialized use in science and tech companies to Enterprise IT applications. That has major implications, as mainstream IT standards for tooling, usability and ease of setup are higher than in scientific and tech company circles. That’s why we’re seeing companies like Microsoft get into the game with cloud-based implementations of Big Data technology that can be requested and configured from a Web browser.

The quest to make Big Data more Enterprise-friendly should result in the refinement of the technology and lowering the costs of operating it. Right now, Big Data tools have a lot of rough edges and require expensive, highly-specialized technologists to implement and operate them. That is changing though, which is further proof of its revolutionary quality.

Spreadmarts aren't Big Data, but they have a role

Is Big Data any different from the spreadsheet models and number crunching we’ve grown accustomed to? What the spreadsheet jocks have been doing can legitimately be called analytics, but certainly not Big Data, as Excel just can't accommodate Big Data sets as defined earlier. It wasn't until Excel 2007 that a worksheet could even hold more than 65,536 rows. Excel can't handle larger operational data loads, much less Big Data loads.

But the results of Big Data analyses can be further crunched and explored in Excel. In fact, Microsoft has developed an add-in that connects Excel to Hive, the relational/data warehouse interface to Hadoop, the emblematic Big Data technology. Think of Big Data work as coarse editing and Excel-based analysis as post-production.
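
That coarse-editing/post-production split can be mimicked without any cluster at all: aggregate the raw events upstream, then hand a small CSV summary to the spreadsheet. The page names are invented, and `Counter` stands in for the Hive job.

```python
import csv
import io
from collections import Counter

# stand-in for the output of a large upstream Hive/Hadoop aggregation
raw_events = ["page_a", "page_b", "page_a", "page_c", "page_a", "page_b"]
summary = Counter(raw_events)  # the "coarse edit": many events become totals

# the "post-production" hand-off: a small summary an analyst can open in Excel
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["page", "hits"])
for page, hits in sorted(summary.items()):
    writer.writerow([page, hits])
print(buf.getvalue())
```

The Hive add-in just collapses this hand-off into a live connection, but the division of labour is the same: the cluster reduces, the spreadsheet explores.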

The fact that BI and DW are complementary to Big Data is a good thing. Big Data lets older, conventional technologies provide insights on data sets that cover a much wider scope of operations and interactions than they could before. The fact that we can continue to use familiar tools in completely new contexts makes something seemingly impossible suddenly accessible, even casual. That is revolutionary.

Natural language processing and Big Data

There are solutions for carrying out Natural Language Processing (NLP) with Hadoop (and thus Big Data). One involves the Python programming language and a set of libraries called NLTK (the Natural Language Toolkit). Another example is Apple’s Siri technology on the iPhone. Users simply talk to Siri to get answers from a huge array of domain expertise.
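
NLTK itself needs separately downloaded corpora, so here is a dependency-free sketch of the first step such pipelines perform: tokenise text, then count term frequencies. The tokenizer here is a crude stand-in for NLTK's `word_tokenize`, and the two-document corpus is invented.

```python
import re
from collections import Counter

def tokenize(text):
    # lowercase word tokenizer; a simplified stand-in for nltk.word_tokenize
    return re.findall(r"[a-z']+", text.lower())

docs = [
    "Big Data will help itself become easier to use.",
    "Natural language technology improves as more text is processed.",
]
# term frequencies across the (tiny) corpus -- the raw material of NLP at scale
term_freq = Counter(tok for doc in docs for tok in tokenize(doc))
print(term_freq.most_common(3))
```

At Big Data scale the same counting step runs as a map/reduce job over millions of documents, which is exactly where Hadoop comes in.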

Sometimes Siri works remarkably well; at other times it’s a bit clunky. Interestingly, Big Data technology itself will help to improve natural language technology, as it will allow greater volumes of written works to be processed and algorithmically understood. So Big Data will help itself become easier to use.

Big Data specialists and developers: can they all get along?

We don't need to make this an either/or question. Just as there have long been developers and database specialists, there will continue to be a call for those who build software and those who specialize in the procurement and analysis of the data that software produces and consumes. The two are complementary.

But in my mind, people who develop strong competency in both will have very high value indeed. This will be especially true as most tech professionals seem to self-select as one or the other. I've never thought there was a strong justification for this, but I’ve long observed it as a trend in the industry. People who buck that trend will be rare, and thus in demand and very well-compensated.

The feds and Big Data?

The recent $200 million investment in Big Data announced by the U.S. Federal government received lots of coverage, but how important is it, really?  It has symbolic significance, but I also think it has flaws. $200 million is a relatively small amount of money, especially when split over numerous Federal agencies.

But when the administration speaks to the importance of harnessing Big Data in the work of the government and its importance to society, that tells you the technology has power and impact. The US Federal Government collects reams of data; the Obama administration makes it clear the data has huge latent value.

Big Data and BI are separate, but connected

Getting back to my introductory point, is Big Data just the next generation of BI?  Big Data is its own subcategory and will likely remain so. But it's part of the same food chain as BI and data warehousing, and these categories will exist along a continuum rather than as discrete and perfectly distinct fields.

That's exactly where things have stood for more than a decade with database administrators and modelers versus BI and data mining specialists. Some people do both, others specialize in one or the other. They're not mutually exclusive, nor is one merely a newer manifestation of the other.

And so it will be with Big Data: an area of data expertise with its own technologies, products and constructs, but with an affinity to other data-focused tech specializations. Connections exist throughout the tech industry and computer science, and yet distinctions are still legitimate, helpful and real.

Where does this leave us?

In the debate, we discussed a number of scenarios where Big Data ties into more established database, Data Warehouse, BI and analysis technologies. The tie-ins are numerous indeed, which may make Big Data’s advances seem merely incremental.  After all, if we can continue to use established tools, how can the change be "Big?"

But the revolution isn’t televised through these tools.  It’s happening away from them.

We're taking huge amounts of data, much of it unstructured, using cheap servers and disks.  And then we're on-boarding that sifted data into our traditional systems. We're answering new, bigger questions, and a lot of them.  We're using data we once threw away, because storage was too expensive, processing too slow and, going further back, broadband was too scarce. Now we're working with that data, in familiar ways -- with little re-tooling or disruption.  This is empowering and unprecedented, but at the same time, it feels intuitive.

That's revolutionary.

Big Data equals big business for Singapore | TODAYonline

Peter Yeo
14 March

SINGAPORE — Big Data is big business, according to Matt Ocko, Managing Partner of Data Collective, who spoke at the Vertex Innovation Forum 2013 yesterday.

Organised by Temasek-owned Vertex Venture Holdings, the forum detailed what Big Data is and why companies would benefit from jumping on this bandwagon. Data Collective is the world’s first Big Data-only early stage investment fund.

Mr Ocko defined Big Data with the three “V”s: namely, data that is too big (Volume), moves too fast (Velocity), and does not fit the mould of existing data in storage (Variety or Variability). “Big Data is data that is growing so exponentially fast, and changing in its underlying content so fast, and it’s so varied in its origin and structure, that traditional IT systems and analytics and other processing systems are unable to cope with it.

“By that, I mean literally everything (computing systems) that was on the planet, up until a handful of years ago, was unable to cope with Big Data at the scale and speed that we’re talking about,” said Mr Ocko.

To give it perspective, Mr Ocko said more information has been generated in the last handful of years than in all of human history, by every single human being on the planet and by every adjunct, including computers, video cameras, etc. In the next two years, the immense volume of data will double again.

As a case study, Mr Ocko said the Obama team put together an enterprise-grade infrastructure that processed 8.5 billion requests during the last United States Presidential election. The system was built from scratch and put together on-the-fly, only 583 days before the election. It processed, on average, 300 requests per second, 24 hours a day, across the entire course of the election; in many cases, there were 10,000 requests per second. This ad-hoc infrastructure, implemented largely on the Amazon Web Services platform, actually performed with higher reliability and speed than most major banks’ financial transaction processing systems.

It allowed the Obama team to simulate the personal choices of 220 million US voters, hourly or better, and to re-run the entire US election 3,000 times per day, including on a neighbourhood-by-neighbourhood basis. That insight let them deploy volunteer time, advertisements, direct mail, email, tweets and Facebook postings in real time, down to the individual and neighbourhood level, far better than the Romney team.

The mistake the Romney team made was that, while they knew this was important, they bought a bunch of very bloated software from some pretty old-school vendors, mostly their political cronies, and built something very unwieldy that did not give them the real-time insights the Obama team had.

The Obama team knew, based on their Big Data analytics, on the morning of the election that they were going to win. They literally all took the day off.

Essentially, Big Data will allow companies to sift through data, from customer feedback via social media or traditional survey forms to readings taken from backend systems such as the temperature gauge on a manufacturing machine, and collate it into information companies can use to gauge customer demand and how their supply chain can meet it. Big Data management can translate these data into real-time reports that companies can react to: for example, a restaurant could gauge when a heat wave will begin and stock up on ice cream and beverages to meet demand.

But more than just businesses, Data Collective Managing Partner Zachary Bogue said Big Data can also be used in agriculture or even healthcare. To prove his point, Mr Bogue showed his wristband, which counts the number of steps he has walked, monitors his sleeping patterns and measures the intensity of his workouts. All the data collected can be used by medical professionals to mitigate health risks. “It becomes increasingly difficult for businesses, government, and even individuals to make ill-informed choices,” said Mr Ocko.

Singapore stands to benefit from the Big Data revolution, which presents telecom and datacentre revenue opportunities for countries that respect intellectual property and the rule of law, said Mr Ocko.

“The fact is security in your software is only as good as the underlying software,” he said.

“If a rival company or a national intelligence agency is watching everything on the physical computer you’re running on, you’ve compromised your security. They know everything.”

Global 2000 companies need a secure computing base to serve Asian customers, and Mr Ocko said his firm’s view is that “Singapore has a huge economic opportunity — the same way it’s transformed its banking industry from a regional powerhouse to a global powerhouse — to become the analog of a banking centre for secure computing”.

Thursday, 28 March 2013

Law to ban Google Glass on the road unlikely | ZDNet

By Charlie Osborne for Between the Lines | March 28, 2013 -- 09:33 GMT (15:03 IST)

A potential ban on wearing Google Glass on the road -- proposed before the product has even been released -- may have seemed premature, but the legislation may already be stopped in its tracks.
A "Stop the Cyborgs" sticker on offer

CNET's Chris Matyszczyk wrote an article documenting the trend of preventing "cyber spying" entering the physical space even more than it already has -- with particular attention on the privacy concerns that Google Glass could bring into being.

One group, with a website called "Stop The Cyborgs," says the product will prove to be the catalyst for a world where "privacy is impossible and corporate control total." Although many fight against technology that threatens to impede privacy, you could also argue that, with the widespread use of social networks including Facebook, GPS systems and our seemingly careless sharing of data, we may be in that kind of world already.

The look of the product aside, if someone is wearing a pair of the glasses, you can't know whether you're being recorded or not. Perhaps it is something about being monitored obviously and in real time that disquiets us, where it sits in our field of vision rather than being simply a security camera on the street or a photo taken on a night out that we can happily ignore.

However, this isn't the only issue. If you'd like to wear your high-tech headgear on the road, fears that such technology may prove distracting have prompted a governmental response. Shortly after CNET's post went live, Republican legislator Gary G. Howell proposed a pre-emptive bill to make it illegal to wear wearable technology while in control of a vehicle. (Of course, using dashboard technology and apps isn't as distracting, is it?)

The bill, H.B. 3057, was designed to stop Google Glassers from wearing their headgear on roads in West Virginia. Although not against the invention itself, Howell said that it could be as distracting as texting -- and therefore could prompt a rise in accidents.

However, it is unlikely to pass this year.

Why? This week, the House Committee on Roads and Transportation sat and discussed the coverage of the bill, but did not discuss the bill itself -- which means that barring a "special committee meeting" before Monday, the proposed legislation will be dead in the water -- at least until next year.

According to Howell, the general attitude on wearable technology and driving means that they "are going to have to look at the impact Google Glass and similar will have." We'll have to see what the next 12 months bring.

How Japan’s Financial Institutions Have Embraced Cloud Computing

March 12, 2013 | Asian MarketThinkCloud
In recent years we have observed Japan’s banks embrace cloud computing. The period from 2010 to 2012 saw cloud computing in Japan shift from a simple buzzword into an essential element of Japan’s financial services IT infrastructure. Today the real-world applications of cloud computing in the financial services sector can be felt every day and are a fundamental element of the country’s banking industry.
The expanding use of the cloud in Japanese banks
This increased use of cloud technology in Japan’s financial institutions has led to real changes in IT infrastructure in Japan. Gradually more of the country’s infrastructure has shifted to the cloud, leading to decreased use of hardware and increased demand for SaaS and IaaS services. This supports the opinion of many global consultancy firms that predict that the spread of cloud technology throughout the world represents a turning point for many key industries. One of the industries most affected by this paradigm shift is, without a doubt, the financial services industry, because the latest trends in cloud computing technology point to potential advances that will have a lasting impact on Japan’s financial services market.
Common trends
After analyzing the adoption of cloud computing services by Japanese financial institutions and financial services providers, we have spotted several trends that can be useful in determining future paths of growth and expansion for cloud service providers. One of the most common concerns among Japanese financial service providers is the issue of standardization, particularly in the following areas:
- Standard systematized processes
- Standardized IT asset allocation
- Standardized IT asset governance
- Architecture
- Platforms
- Software configuration
- Software builds
- Standard operations
- Standard methodologies for maintenance
The need for standardization in all of these fields is a recurring theme in this market and one of the key steps for cloud service providers will be to come to terms with competitors about standards for the region. We have also observed that Japanese financial institutions have very high expectations for the potential of cloud computing to transform the following areas in their daily operations:
1. SaaS, regarding items and areas that can be commercialized. This includes payment systems and human resources operations.
2. Using cloud computing to procure the powerful processing power needed for temporary operations with a high demand for resources. For example, financial businesses can benefit from using the enormous computing power provided by the cloud to run Monte Carlo simulations.
3. Using the cloud to create shared data centers for smaller financial institutions in order to expand operations and reach a wider consumer base without having to invest in costly infrastructure and data center maintenance.
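Item 2 above, renting burst compute for one-off Monte Carlo runs, can be illustrated with a toy simulation. The sketch below prices a European call option by simulating terminal stock prices under geometric Brownian motion; the parameters and function names are illustrative, not drawn from any bank's actual workload.

```python
import math
import random

def monte_carlo_call_price(s0, strike, rate, sigma, years, n_paths, seed=42):
    """Estimate a European call price by simulating terminal prices under GBM."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        # terminal stock price for one simulated path
        s_t = s0 * math.exp((rate - 0.5 * sigma ** 2) * years
                            + sigma * math.sqrt(years) * z)
        payoff_sum += max(s_t - strike, 0.0)
    # discount the average payoff back to today
    return math.exp(-rate * years) * payoff_sum / n_paths

# Illustrative parameters; the Black-Scholes analytic value here is about 10.45
price = monte_carlo_call_price(100, 100, 0.05, 0.2, 1.0, 100_000)
```

Accuracy improves with the square root of the path count, so a production-quality run needs millions of paths across many risk factors: exactly the temporary, embarrassingly parallel burst of compute that cloud rental suits.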
The concerns and expectations of cloud computing in Japanese banks
There is a high degree of interest in how cloud computing can help optimize financial services and global development. The main goal of adopting cloud technology is creating value, by permitting collaboration and establishing an agile platform for future development. This is an advantage of the technology that GMO Cloud constantly emphasizes. It is also clear that Japanese financial institutions now view many aspects of cloud computing as permanent fixtures in their business model, far from a passing fad.
There are several areas where cloud computing is already commonplace in these businesses. This includes using shared IT platforms for email, website and IT development, sales and customer support and management, and disaster recovery. Cloud computing is also used for data analysis and for the launch of new business operations and projects. This means that Japanese financial institutions are already committed to pursuing and implementing cloud computing in their operations.
This is especially important because one common misconception of cloud technology is that, like many previous technologies that have come and gone, it is somehow a temporary trend. This could not be further from the truth. As is evidenced by Japanese financial services providers, cloud computing represents a fundamental paradigm change in business. It is already used as part of core business operations and its use in this market sector is only likely to increase in the future. More importantly, the Japanese tech market has a long history of early technology adoption, meaning that this points to a worldwide trend in the next decade.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

About the Guest Author:
Nida Rasheed
Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.

Wednesday, 27 March 2013

'Big Data' to drive analytics adoption in APAC | Enterprise Innovation

By Enterprise Innovation Editors | 2011-10-19

The phenomenon called "Big Data"—where the volume of data is expected to explode in the next few years—is set to drive investments into business analytics by firms in the Asia Pacific region to provide better insights into the data they have collected over the years, according to IDC.

Other aspects of "Big Data" are indeed new. The variety of the data sources is growing at a rapid rate, particularly as businesses move into the semi-structured and unstructured realm (e.g. social media interactions, rich media files and geospatial information). The other emerging factor that organizations need to contend with is the increased velocity at which data is being generated (e.g. real-time sensor data feeds from smart meters).

"These new aspects of 'Big Data' are creating unprecedented levels of complexity for IT executives, particularly as they realize that these massive data sets cannot be processed, managed and analyzed using traditional databases and architectures," says Philip Carter, Associate Vice President at IDC Asia/Pacific. "What is becoming clearer is that the real value from 'Big Data' will be derived from the high-end analytics, predominantly using data mining, statistics, optimization and forecasting type of capabilities to proactively turn this data into intelligence to drive business benefits and better decision making capabilities."

In line with this trend, as businesses in Asia invest to drive growth in emerging markets, they are harnessing analytics-led solutions to gain better customer insights, and manage risk and financial metrics more effectively while striving for unique market differentiation. In a February 2011 C-suite barometer survey drawing over 1000 responses from CIOs and LoBs across Asia/Pacific, IDC found that business analytics ranked as the top rated technology that would allow organizations to gain significant competitive advantage in the year ahead. In addition, in an IDC June 2011 survey of over 1300 CIOs and IT decision makers across Asia/Pacific (excluding Japan) or APEJ, data management and analytics ranked as the top business priority for organizations in the region.

However, the approach to business analytics in the era of "Big Data" will be significantly different to the traditional approach. “For example, one of the key differences between traditional analytics and what we are dealing with in terms of the 'Big Data' era is that we are gathering data that we may or may not need. From an analysis perspective, this means ‘we don’t know what we don’t know’. To run an analysis on 'Big Data', the variables and models are likely to be entirely new. Therefore, a different infrastructure strategy and perhaps most importantly, new skill sets, are required,” adds Philip.

To cope with the challenges "Big Data" poses, organizations must begin looking at deploying not only the applications traditionally used for Business Analytics (BA), but also the supporting architecture in order to scale efficiently. IDC recommends looking at cloud bursting, the deployment of analytical appliances, and creating truly scalable enterprise architectures that leverage the attributes of high performance computing. This approach should also allow for the deployment of new technologies and frameworks such as Hadoop to assist with the analysis of large pools of disparate, unstructured data.

This will require new technical skill sets -- particularly around emerging technologies like Hadoop, Map Reduce and Key Value Stores -- as well as a revised approach to the role of the business analyst. The next generation business analyst will be more akin to a "data scientist". These individuals will have strong statistical skills and will be able to extract information from large datasets and present value to non-analytical experts. They will also have the unique skill of understanding the new algorithms and analytical models that will have the most significant business impact in the short term.
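The map/reduce model mentioned here can be sketched in a few lines. The following is a single-process illustration of the three phases (map, shuffle, reduce) that a framework like Hadoop distributes across a cluster; the word-count task and all names are the canonical teaching example, not anything from IDC's study.

```python
from collections import defaultdict

def map_phase(document):
    # map: emit a (word, 1) pair for every token in the document
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    # shuffle: group all values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # reduce: aggregate each key's values into a final count
    return {key: sum(values) for key, values in groups.items()}

documents = ["big data big insight", "data science"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
# counts == {"big": 2, "data": 2, "insight": 1, "science": 1}
```

In a real deployment the map and reduce functions run in parallel on many machines over blocks of a distributed file system; the programming model, however, is exactly this small.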

In conjunction with the skills dimension, IDC believes that organizations need to be looking at their "Big Data" analytics strategy across the following dimensions:

    Technology identification/deployment
    Business case creation and ROI justification
    Data governance frameworks with clear policies and guidelines around master data management, data quality and data models
    Ensure IT/Business alignment by involving the critical stakeholders at the right time
    Involve the CIO as the supporter of the necessary transformation from an IT perspective that will in turn create the necessary business impact

As part of this strategy, some organizations are putting in place a Business Analytics Competency Center (BACC). "This structure should include stakeholders from management, IT and the business to ensure that the projects undertaken get the right level of business alignment without impacting the data governance that IT needs to put in place. This is necessary because despite all the hype around 'Big Data', CIOs will need to be realistic about their approach to 'Big Data' analytics and focus on specific use cases where it will have the biggest business impact," recommends Philip.

Big Data Lives or Dies Based on Customer Data Management Strategy - The CIO Report - WSJ

By Noel Yuhanna and Mike Gualtieri

Digital technologies empower today’s customers, disrupt every industry, and cause your executives to question their competitive strategies. In the age of the customer, the only sustainable competitive advantage you can have is the degree to which you know and engage with your individual customers.

But despite the importance of this new technology, progress has been slow for most firms.

Traditional approaches are insufficient because they don’t deliver what today’s customer expects — engagement — and lack granular and contextual customer data. Executives don’t find actionable meaning in the data, and fail to use personal data in customer experiences. Some companies — think Netflix Inc. with its Cinematch recommendation engine and Wegman’s Food Markets with its in-store mobile app — are already excelling at creating the engaging, individualized experiences around your products and services that today’s customer demands.

The key to success is implementing a multidimensional view that helps individualize and contextualize customer experiences, deliver new customer insights, and create new opportunities for businesses to deliver differentiated experiences. This requires a new IT architecture that can support faster insights, process larger amounts of data more quickly, enable predictive analytics, and support the integration of information from both inside and outside your four walls. Forrester defines a multidimensional view of the customer as:

A view of the customer that uses all of the available information about them — including information pertaining to psychographics, behaviors, social networks, smart devices, geolocation, and Internet usage — to deliver individualized and contextual products, services, and experiences.

The four key technology components of such an architecture are Big Data, predictive analytics, in-memory technologies, and data virtualization. The business benefits of these four technologies help overcome the gaps and limitations of traditional data management platforms to support real-time data integration, exploit new data sources, and speed the generation of predictive customer insights.

Big Data is critical, but is only one piece of your customer data management solution. We often hear from clients that they need a “Big Data strategy”. What they really need is a holistic strategy that includes Big Data, predictive analytics, in-memory, and data virtualization. Here’s what you need to know about each of these technologies:

Big Data Is The Fuel That Drives Your Customer Experience Engine

You must consider every shred of customer data available for analysis, as it may contain gems that you can use to individualize experiences. Traditional data management solutions and approaches have difficulty consolidating and processing the array of large and unstructured data sets that defines Big Data. To support a customer Big Data platform, you need new technologies and architectures, including Hadoop, NoSQL databases, advanced enterprise data warehouses, and cloud analytic platforms.

Predictive Analytics Learns About The Individual Needs Of Your Customers

Predictive analytics uses machine-learning algorithms to dig deeper to find patterns that you can’t see using traditional BI tools. Big Data has breathed new life into predictive analytics, as more data can lead to better predictive models. Firms use these predictive models to anticipate what individual customers want, just as Pandora Media Inc.'s recommendations engine (a predictive model) provides personalized song playlists. Predictive analytics requires a breadth of tools and technologies to store, process, and access the volume, velocity, and variety of Big Data. Predictive analytics includes general-purpose Big Data predictive analytics solutions, industry- or domain-specific solutions, embedded solutions, database analytics, and consulting offerings.
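The kind of pattern-finding described here can be sketched with a minimal neighbour-based recommender, the simplest relative of the approach behind recommendation engines like Pandora's. Everything in the sketch (the user names, the purchase-intensity vectors, the `recommend` helper) is a hypothetical illustration of the idea, not any vendor's actual algorithm.

```python
import math

def cosine(u, v):
    # cosine similarity: how closely two customers' behaviour vectors align
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Hypothetical purchase-intensity vectors, one entry per product
ratings = {
    "alice": [5, 3, 0, 1],
    "bob":   [4, 0, 0, 1],
    "carol": [1, 1, 5, 4],
}

def recommend(target):
    # find the most behaviourally similar customer...
    others = (u for u in ratings if u != target)
    nearest = max(others, key=lambda u: cosine(ratings[target], ratings[u]))
    # ...and suggest products they engaged with that the target hasn't touched
    return [i for i, (mine, theirs)
            in enumerate(zip(ratings[target], ratings[nearest]))
            if mine == 0 and theirs > 0]

suggestions = recommend("bob")   # bob's nearest neighbour turns out to be alice
```

More data sharpens this directly: with more customers and more behaviour per customer, the nearest-neighbour estimate of "what people like you want" gets steadily better, which is the sense in which Big Data has breathed new life into predictive analytics.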

In-Memory Data Solutions Deliver Value In Real Time

Ever wonder why Google Inc. searches run so fast compared with accessing data in your business applications? Memory matters! Customer data stored and processed in memory creates an opportunity to host the predictive models and data needed to upsell and cross-sell new products to a customer in real time. Key technologies that can help deliver real-time customer experiences include in-memory platforms and event processing platforms.
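A minimal sketch of the in-memory idea: keep hot customer records in RAM and fall back to the slow backing store only when an entry is missing or stale. The `TTLCache` class and `load_profile` loader below are illustrative assumptions, not any product's API; real in-memory platforms add distribution, eviction policies, and persistence on top of this core pattern.

```python
import time

class TTLCache:
    """Serve hot customer data from memory; reload from the slow source when stale."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}          # key -> (value, timestamp)

    def get(self, key, loader):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and now - entry[1] < self.ttl:
            return entry[0]       # in-memory hit: no round trip to the backend
        value = loader(key)       # miss or stale: fall through to the slow source
        self._store[key] = (value, now)
        return value

backend_calls = []
def load_profile(customer_id):    # hypothetical stand-in for a slow database read
    backend_calls.append(customer_id)
    return {"id": customer_id, "tier": "gold"}

cache = TTLCache(ttl_seconds=300)
cache.get("cust-42", load_profile)
profile = cache.get("cust-42", load_profile)   # second call is served from memory
```

The second `get` never touches the backend, which is exactly the latency win that makes real-time upsell and cross-sell decisions feasible.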

Data Virtualization Is A Silo Crusher

Data silos are still a huge problem for most firms. Data virtualization integrates disparate customer data sources in real time or near real time to deliver a comprehensive multidimensional view of the customer to support personalization. Data virtualization can support all types of data: structured (text, relational data, and formatted data), semistructured (XML and files), and unstructured (emails, blogs, images, and video). It can integrate with Hadoop, NoSQL, and enterprise data warehouse platforms, and various on-premises sources such as packaged apps, custom apps, and mainframe apps. Data virtualization can also integrate with external sources: social platforms like Facebook Inc., LinkedIn Corp., and Twitter Inc.; software-as-a-service applications like those from Inc., SAP AG’s SuccessFactors, and SugarCRM; and marketplace data-as-a-service applications such as DataMarket Inc., Dun & Bradstreet Inc., Factual Inc., and Microsoft Corp.’s Windows Azure Marketplace.
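The core idea of data virtualization, querying sources in place rather than copying them into a warehouse, can be sketched as a federated lookup. The `customer_view` function and the two toy sources below are hypothetical stand-ins for a CRM system and a social platform, not any virtualization product's interface.

```python
def customer_view(customer_id, sources):
    """Build a multidimensional customer view by querying each source at request time."""
    view = {}
    for fetch in sources:
        record = fetch(customer_id)
        if record:
            view.update(record)   # later sources enrich the view; nothing is copied
    return view                   # into a central warehouse ahead of time

# Hypothetical stand-ins for a CRM system and a social platform
crm = {"42": {"name": "A. Lee", "tier": "gold"}}
social = {"42": {"handle": "@alee"}}

view = customer_view("42", [crm.get, social.get])
# view == {"name": "A. Lee", "tier": "gold", "handle": "@alee"}
```

Because each source is consulted at query time, the unified view stays as fresh as the silos behind it, which is the "real time or near real time" property the article describes.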

Your customers increasingly expect and deserve to have a personal relationship with you and the hundreds of firms in their lives. Companies that continuously ratchet up personalization will succeed. Those that don’t will increasingly become strangers to their customers. This sounds bad — but there is good news. The world is flush — and getting flusher — with Big Data from cloud, mobile, and the Internet of Things. Firms that invest heavily in a holistic approach will be primed to take advantage of their customer data and use it as a differentiator.

Noel Yuhanna and Mike Gualtieri are principal analysts at Forrester Research Inc.

Profits rise at cloud computing firm Iomart - BBC

Iomart, the Glasgow-based cloud computing firm, has forecast that its profits for 2012 are on course to come in ahead of market expectations.

In a statement to the London Stock Exchange, it said the year to the end of this month is likely to see pre-tax adjusted profit of around £10.6m.

That would be a rise from £6.9m in the previous year.

It said the firm's growth was coming from expansion of its existing base as well as from acquisitions.
It bought Manchester-based Melbourne Server Hosting in August 2012.

Directors say that pattern is expected to continue as consolidation in the cloud computing sector takes place.
Angus MacSween, the firm's chief executive, said: "Iomart continues to benefit from a compelling mix of a growing market, recurring revenues, sticky customers, good forward visibility and a leading competitive position.
"As a result we remain very confident of further growth in the next financial year and beyond."

Five myths of cloud computing

By: Bill Kleyman 
March 26th, 2013 
Technologies around the Internet and the WAN have been around for some time. However, it wasn’t until very recently that a specific term began circulating which was supposed to emphasize the combination of these technologies. Cloud computing was born out of the idea of a distributed computing system where information was available from numerous different points. Although the idea has certainly caught on – there are still some misconceptions and confusions around the cloud.
Many businesses have found great ways to utilize a cloud model. Now, they’re able to be more agile, grow faster and even add to their business resiliency. Still, there are those that have never really worked with an enterprise cloud model and are held back by myths and confusion points around the technology.
In HP’s Five myths of cloud computing, we learn some of the biggest myths currently circulating in the cloud industry. Remember, the cloud is a vast, diverse model that can accommodate many different types of organizations. Whether it’s a private, public, hybrid or community cloud, there may be a fit for your organization. Still, without fully understanding the cloud model, it’s easy to be confused by so many different types of offerings.
The Five myths of cloud computing whitepaper outlines the key areas where IT managers and business stakeholders should seek more clarification. Specifically:
  • Myth 1: The public cloud is the most inexpensive way to procure IT services
  • Myth 2: Baby steps in virtualization are the only way to reach the cloud
  • Myth 3: Critical applications do not belong in the cloud
  • Myth 4: All cloud security requirements are created equally
  • Myth 5: There is only one way to do cloud computing

12 hard truths about cloud computing | Cloud Computing - InfoWorld

Performance, security, cost -- here’s what to really expect from the cloud

For the past few months, I've been poking around the various commercial clouds, buying new machines, trying software, and running benchmarks. Well, not exactly buying machines -- just renting them for a few hours and plunking down a few pennies on the barrelhead.

Along the way, I noticed it wasn't working out the way I expected. The machines aren't as interchangeable or as cheap as they seem. Moving to the cloud isn't as simple or as carefree as it's made out to be. In other words, the machines weren't living up to their hype. Anyone who's been chugging the Kool-Aid and dreaming that the word "cloud" is a synonym for "perfection" or "pain-free" is going to be sorely disappointed.


This isn't to say there's no truth to what the cloud companies proclaim, but there are plenty of tricky details that aren't immediately obvious. At their core, the machines aren't miracle workers, just the next generation of what we've been using for years. The improvements are incremental, not revolutionary. If we dial back our hopes and approach the machines with moderated expectations, they're quite nice.

To keep our expectations in check, here is a list of what to really expect from the cloud.

Cloud computing hard truth No. 1: Machine performance isn't uniform
The cloud is meant to abstract away many of the choices that normally go into shopping for a server. You're supposed to push a button, choose your operating system, and get the root password. Everything else is supposed to be handled by the cloud, a nebulous Great Oz that takes care of all those computational chores behind the curtain.

The one thing the benchmarks have taught me is that machines behave quite differently. Even if you buy an instance with the same amount of RAM running the same version of the operating system, you'll find startlingly different performance. There are different chips and different hypervisors running underneath everything. Then the companies can load up their boxes with different numbers of virtual machines.

Cloud computing hard truth No. 2: Too many choices
Sure, many machines pretend to be commodities, but what does it really mean for something to be a high-CPU machine? Then there's the CUDA architecture.

Here, the great promise of the cloud rings true: You can rent something souped-up by the hour and see what it can do. Your boss may not want to give you the money to actually purchase a rack of Nvidia cards to test out the parallel-processing power of the CUDA architecture. A rack of video cards on the purchase order looks like it might support too many time-wasting games of Call of Duty. But a few hours on an Nvidia cloud box is an easy decision for a purchase manager to make.

Expect more complicated hardware as the infatuation with big data grows bigger. Renting out the machines by the hour is an ideal way to get people interested in trying the devices. But with increased choice comes increased complexity and increased uncertainty about what is truly needed.

Tuesday, 26 March 2013

Big Data Market Expected to Reach $48.3 Billion in 2018 | CloudTimes

February 5, 2013

Data sources in the world today are vast. Companies have access to a wide range of innovative resources that give them the ability to gather and analyze consumer information: continuous feeds from measuring devices, events from radio-frequency identifiers, message streams from social networks, meteorological data, remote sensing, location data from mobile network subscribers, and audio and video recordings.

A report by Transparency Market Research forecasts that over the next five years, the global Big Data technology market will grow to $48.3 billion from a comparatively modest $6.3 billion last year, a CAGR of 40.5 percent from 2012 to 2018.
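As a quick arithmetic check, the figures are internally consistent: compounding $6.3 billion at 40.5 percent per year for the six years from 2012 to 2018 lands close to the $48.3 billion forecast.

```python
# Verify the Transparency Market Research forecast arithmetic
start_billion = 6.3    # 2012 market size, USD billions
cagr = 0.405           # 40.5 percent compound annual growth rate
years = 6              # 2012 through 2018

projection = start_billion * (1 + cagr) ** years
print(round(projection, 1))   # roughly 48.5, consistent with the $48.3bn figure
```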

According to Transparency Market Research, five companies including HP, Teradata, Opera Solutions, Mu Sigma and Splunk, commanded more than 60 percent of the big data market in 2012.

The market for Big Data technology and services represents a global opportunity for rapid growth worth billions of dollars. North America is expected to lead the market through 2018, accounting for more than 54 percent of all revenue, but the Asia-Pacific region will outpace other markets, growing at a CAGR of 42.6 percent from 2012 to 2018. A flourishing outsourcing industry, distributed manufacturing hubs and lenient regulations on data sharing are among the factors creating significant opportunities for the Big Data market in the Asia-Pacific region, says the report.

Most of the data generated between 2012 and 2018 will come not from people but from machines interacting with one another and with data networks. The key vertical markets include financial services, manufacturing, healthcare, telecommunications, government, retail, and media & entertainment.

In healthcare, multiple and varied stakeholders -- the medical and pharmaceutical product industries, care providers and patients -- all generate pools of data. A major portion of clinical data is not yet digitized, and big data tools are helping these stakeholders put that pool of data to effective use.

Despite the prospects offered by new technologies, IT leaders will have to consider several constraints. Talent is one of them: according to IDC research, only 0.5 percent of the world's information is currently being analyzed, and the shortage of people with deep analytical expertise and the skills to use big data tools efficiently is limiting the market's growth.

APAC enterprises unprepared for big data | ZDNet

Summary: Region's firms anticipate big data growth and need for in-depth analysis, but have no strategy as yet to manage data to glean business insight, new study finds.
Companies in the Asia-Pacific region are ill-prepared for the advent of big data, with demand for comprehensive management and analysis of mounting data volumes expected to outpace the actual ability of their information management systems to do so, a new survey has found.
Market research firm IDC said while 67 percent of respondents believed their existing storage infrastructure is sufficient to meet business needs for the next 12 months, 72 percent admitted they did not have strategies to cope with the anticipated growth of unstructured "big data"--data that is increasingly important as a competitive resource for data mining and other business uses.
And with data growth having outrun their ability to manage it effectively, 64 percent of those surveyed said their need for in-depth analysis of the data has outpaced the ability of their current IT systems to ensure the data is relevant, timely and useful.
These findings were unveiled Wednesday in IDC's report, "The Changing Face of Storage: A Rethink of Strategy that Goes Beyond the Data". Commissioned by Hitachi Data Systems (HDS), the study was conducted in September and polled a total of 150 IT executives from companies in six countries in the region--Singapore, Malaysia, China, India, Australia and New Zealand.
The results have shown there are varying levels of maturity and understanding of storage management in this region, Simon Piff, associate vice president for enterprise infrastructure research at IDC Asia-Pacific, said in a statement.
Considering that the challenges of ensuring data relevancy and managing data growth were ranked among the top five common issues, it is clear that the anticipated trend toward big data is "something few are ready to take on", he pointed out.
Across all Asia-Pacific markets surveyed, 56 percent cited data growth as their main challenge. Increasing utilization levels and managing storage for virtualized servers were also hurdles for 39 percent and 36 percent of respondents, respectively.
Data needs to be stored, governed and managed for insight and innovation in order to drive strategic and competitive value, Kevin Eggleston, senior vice president and general manager, Hitachi Data Systems Asia-Pacific, said in the same statement.
To that end, HDS has realigned its cloud portfolio--a three-tiered strategy of infrastructure cloud, content cloud and information cloud--to help enterprises to manage data growth as well as collect and connect disparate pieces of data such that they become an asset for business insight and innovation, he noted.
Storage's "conservative" attitude needs change
IDC's Piff told ZDNet Asia in a separate e-mail interview that there are various reasons why Asian firms have found themselves in a position where they are not ready for the big data deluge.
According to him, storage has typically been the final part of any solution decision-making equation, selected based on a long list of prerequisites just for the demands of a specific app. Server virtualization has changed all that, as suddenly storage needed to be networked and no longer tied to an application, he explained.
"This radical departure from time-tested storage management theory has put many storage environments into a state of shock at the impact server virtualization [has caused]," he said. "Consequently there's a lot of catching up to do by several organizations just to have storage deliver what the new virtual systems demand."
This problem is compounded by the stance that storage is "probably the most staid and conservative department within IT, and is very slow to change", he added.
Piff noted that another possible reason is that while many "efficiency technologies" including deduplication, virtualization and thin provisioning could have been utilized, they often get cut from final budgets and basic disks are deployed instead. The lack of these technologies means more time is spent managing storage "hygiene" and less on considering strategic choices, he pointed out.
Asked when Asia-Pacific companies will finally be ready to manage and mine big data effectively, Piff replied that there are a number of dependent criteria. "Budget is of course the biggest issue, and with the current economic uncertainties, it is unclear if anyone has the necessary appetite to update their environments."
The other factor is mindset, he emphasized. Technologies to improve storage efficiency have already been around for a while and budget notwithstanding, IT departments--especially storage professionals--have to come to terms with the radical transformation and velocity of change that virtualization and cloud computing have brought and their impact on storage. That, in turn, requires a change of attitude toward data management.
While it may only take a relatively short while to solve big data management issues, "the motivation needs to be there" first, Piff concluded.

Forrester Report Calls for IT Culture Overhaul

By Marissa Berenson
Wed, August 06, 2008
IDG News Service —
As many as 85 percent of survey respondents believe the culture of IT can differ from the overall culture of a firm, according to a recent report by Forrester Research entitled "Does your IT culture need an overhaul?"
In fact, IT department culture is probably not a match with overall corporate culture in about half of all businesses, Forrester analyst Marc Cecere estimated. The research firm interviewed 15 CIOs in depth and surveyed 41 IT decision makers for the study, which defines corporate culture as the way individuals feel themselves to be part of a company's identity.

"Sometimes your IT is organized around efficiency and your business is organized more around responsiveness," Cecere said in an interview.
A distinct IT culture may evolve in a firm due to the different ways each department measures success. And, in a large company where leadership varies among departments, cultural gaps are almost inevitable, Cecere noted.

However, the report states, problems can arise when the culture of IT strays too far in three directions:
-- Too IT-centric, insular or fearful - When IT doesn't have a healthy relationship with the rest of the enterprise, it's in danger of forming what Forrester calls "an us-versus-them culture where IT hunkers down behind the technologies they manage, problems they solve, and metrics like help desk tickets served, system capacity, uptime, and volumes."
-- Too heroic, free range or autonomous - The dangers inherent in this style are a tendency to firefighting and working extreme hours to solve problems for customers. This can also spawn a tendency to developing workarounds, rather than understanding and fixing the underlying issues.

-- Too bureaucratic - IT departments can isolate themselves from the business if they set up too many formal processes that customers must follow. In the interests of comprehensiveness or security they may ask customers to submit overly complex requirements definitions and the like, but this can create unnecessary barriers between business needs and IT solutions, according to Forrester.

Cecere believes a company operates most effectively when the cultures between IT and other departments are in sync.

"At a minimum, the cultures shouldn't be in conflict with each other," he said.
So, how does a CIO go about overhauling IT culture?

The first step, Cecere said, is to clearly identify the cultural gaps, examining differences in decision-making styles and levels of risk between IT and other departments. Once identified, strong leadership and clearly defined metrics of success will aid in closing those gaps, as will a strong network of people within IT who share information with the CIO on a regular basis.

"It's what I call institutionalizing communication," he said. "It's more than just communicate, communicate, communicate, which you hear all the time. It's actually being very disciplined and very organized about it."
Performing such an overhaul, though, requires patience, as Cecere admitted a cultural shift is "a long process."
"You can change systems quickly compared to how fast you change culture," he said, "because culture is a lot about how people act when you're not looking at them."

Sweden drops the word 'ungoogleable' following pressure from Google | The Verge

The Language Council of Sweden has dropped the term "ungoogleable" from its list of new words, following pressure from Google to adapt its definition to something more flattering for the company. According to Sveriges Radio, Google wanted the meaning of the term ogooglebar — which describes something "that you can't find on the web with the use of a search engine" — to be altered so that it would only describe searches performed using Google's own search, something that the Language Council was not willing to do.
Language Council head Ann Cederberg said engaging Google's lawyers took "too much time and resources," prompting it to remove the phrase from its 2012 list of new words. But that won't be the last you hear of it. Cederberg is well aware that "ungoogleable" is already a popular word in Sweden, and Google will not be able to stop locals from using it. It's an unfortunate position for Google to be in; despite wanting to become the brand most associated with web searching, the company has fought to protect its name so that it can avoid it becoming a generic trademark, something that zipper, escalator and aspirin have all fallen foul of.

Is Tim Cook no more than an 'administrator'? | ZDNet

Between the Lines
Has Apple truly lost its way with Tim Cook at the helm?
A former Microsoft executive has made this claim. Writing a guest piece for Forbes, Bob Herbold, Microsoft's chief operating officer from 1994 to 2001, argues that the iPad and iPhone maker simply hasn't been the same since its former CEO was in control.

Herbold argues that while Apple stock continues to lose its glitter in the eyes of investors, data and numbers only tell us half the story. Stock prices, he argues, are not based solely on product line success or the balance sheet, but also relate to the perceived future of a company -- and in order to keep shareholders interested, the belief that a firm has innovative and visionary leadership is a crucial component.

While the late Steve Jobs is called the "ultimate visionary leader," Tim Cook, who replaced Jobs after the co-founder passed away in 2011, is implied to be nothing more than an office body. Although Cook is not mentioned by name, Herbold claims that Apple requires "a visionary leader, not an administrator."
"The leader needs to be paranoid about making the core offerings of the organization more exciting and more impactful with its customers," Herbold writes. "That sounds simple, but doing it with clarity and speed is absolutely necessary. You must avoid any kind of bureaucracy that can water down the impact of your efforts or slow it to a snail's pace."

In addition, Herbold says that to be the type of "visionary" leader modern-day businesses require, a CEO doesn't have to be a technology genius, but does need strong business acumen. Powerful, long-reaching business strategies are necessary, and deep, personal involvement with the details of the corporation is a must.

Comparing former IBM CEO Lou Gerstner and Steve Jobs, the former Microsoft exec says that the time spent getting to know your customers, their opinions and their needs, can improve the success of your products. Gerstner spent three months simply talking to customers about their information-technology challenges and based IBM's strategy on this, whereas Jobs personally led the design and development of Apple's consumer products -- and perhaps Cook has fallen short of this expectation.
As a parting shot, Herbold says that business managers must have "the guts to lead" to keep a firm competitive. It isn't about having charisma, but being strong-willed and knowing what you want to accomplish in the long-run.

The former Microsoft executive finishes by commenting:
"Apple could surprise us in the next six to nine months by emerging with yet another big new idea. On the other hand, I think the stock market is telling us that the public is beginning to believe that Apple really doesn't have strong visionary leadership. Apple will be a solid technology company but the Apple era may be on its way out."

Google's Ad Chief: 50 Percent of Ads Will Go Online in the Next 5 Years - Mike Isaac - Dive Into Media - AllThingsD


Ads are part of the very fabric of our society, and have been for years. But ad execs want to stick with what works: It’s why the bulk of today’s industry ad budgets are still pointed at traditional mediums like television, print and the like.
Not for long, according to Google SVP and chief business officer Nikesh Arora, in conversation at the D: Dive Into Media conference on Tuesday.

“There’s currently about $800 billion in the global advertising market today. That’s a very large number, but online advertising accounts for less than $100 billion of that number,” Arora said. “There is a reasonable probability that over 50 percent of advertising goes online in the next five years.”
Ambitious, to say the least. We’ve been stuck in the same model for the better part of the past century, and companies like Google have spent the last decade trying to convince the ad industry that, yes, the Web can indeed make you money. That’s exactly what YouTube’s Robert Kyncl has been pitching with the online video arm of Google and its channels initiative, not to mention Google’s other potential ad businesses (which have indeed been successful).

Funny, considering Arora wasn’t about to say what online advertising would look like 10 years from now. He just knows that whatever it’ll look like, it’s going to be successful.
He gave a bit of insight into how it’ll get there though. “The big tipping point we’re waiting for is Internet connected televisions,” Arora said. “We’re waiting for things going from ‘nice-to-have’ to ‘must-have.’” So basically, when his company can get that whole Google TV thing to take off — or perhaps others in the space wanting to do the same — we’ll see the tides of change begin to shift.
Check back with you in five years, Nikesh.

Australia top in Cloud computing adoption: BSA - findings, cybersecurity, policy, BSA, security, Cloud, study, government, cloud computing, ranking - ARN

Study shows Australia ranks second in Cloud computing adoption, behind Japan

Australia is one of the world leaders in Cloud computing policies, maintaining its second place ranking just behind Japan in the global Cloud computing standings, a new study by the Software Alliance (BSA) has found.

The 2013 BSA Global Cloud Computing Scorecard, which analysed the shifting global policy landscape for Cloud computing, evaluated 24 economies globally in seven policy areas critical to the market for Cloud computing services.

The study praised Australia for its up-to-date cybercrime regime, bolstered by its ratification of the Convention on Cybercrime. It was also commended for its international harmonisation of rules – its security ranking received a boost when the government dropped plans for mandatory Internet filtering.

The Federal Government also most recently announced a new cyber security centre in Canberra and an additional $1.46 billion in funding for cyber security as part of a new national security blueprint.
BSA Asia-Pacific senior director for government relations and policy, Roger Somerville, said that although Australia has improved in many areas of the scorecard by adopting and enhancing policies that are conducive to Cloud innovation, there remains room for improvement.

“Every country’s policies affect the global Cloud marketplace, so it is imperative for Australia to continue to focus on improvements.

“We encourage the Australian government to continue to commit to public sector Cloud use and adoption, similar to the US government’s approach of adopting a Cloud first policy. This will help Australia maintain, if not improve on, its ranking and help grow the global Cloud,” he said.

The study also found that the sharp divide between advanced economies and the developing world, revealed in the 2012 scorecard, has narrowed this year.
Somerville claimed it is a result of significant progress made by some developing countries, many of which are in the Asia-Pacific, along with the stable progress in major developed countries.

FinFET with the world's smallest characteristics variability

(Nanowerk News) Takashi Matsukawa, Meishoku Masahara and colleagues in the Silicon Nano-Device Group of the Nanoelectronics Research Institute at the National Institute of Advanced Industrial Science and Technology (AIST) have developed a prototype 14 nm-generation 3D transistor (FinFET) with the world's smallest characteristics variability (Fig. 1, left).
Figure 1: Low-variability FinFET prototype (left) and a comparison with past reports on characteristics variability intensity (right).
A primary cause of the characteristics variability in a FinFET is the variability of physical properties of the metal gate electrode material. An amorphous metal material for the gate electrode that has a small variability of physical properties has been developed and the prototype FinFET with the world’s smallest characteristics variability was fabricated using the material. With integrated circuits beyond the 14 nm generation, including SRAMs (static random access memories), major issues have been the hindrance to performance improvement and the reduction in yields, both due to the characteristics variability of elements. The present results are expected to contribute to solving these issues.
Details of this technology have been presented at the 2012 International Electron Devices Meeting (IEDM 2012) held in San Francisco, U.S.A., from December 10 to 12, 2012.
Social Background of Research
Until now, performance improvements and increases in the integration scale of silicon integrated circuits have been achieved by miniaturizing transistors, the smallest elements of these circuits. Miniaturization is also linked to cost reduction, so fierce competition to develop ever-smaller transistors continues. However, in the 14 nm-generation transistor technology expected to be in production after 2017, the small size of the elements will cause variability of characteristics among transistors, raising concern about hindered product performance and reduced yields. In particular, SRAMs, which occupy more than 50% of the area of system LSIs (large-scale integrated circuits) and microprocessors, are easily affected by characteristics variability owing to the intensive use of minimum-size transistors in their circuits. For these reasons, there is a strong need to develop miniaturized transistors with small characteristics variability.
History of Research
AIST has been conducting research on FinFETs, transistors with a new, three-dimensional structure. In 2003, AIST proposed a four-terminal FinFET whose parameters can be controlled electrically and demonstrated its operation. In 2008, AIST identified the variability of the physical properties of the metal gate electrode material as a new factor affecting the characteristics variability of FinFETs, and followed this with a new transistor-manufacturing technology that reduces that variability. AIST also identified the primary factor causing the variability of on-state current in 14 nm-generation FinFETs (AIST press release on December 8, 2011). AIST has since been continuing its research and development efforts to suppress the variability of characteristics in FinFETs.
This research was conducted as part of the project “Technology Development of New Nanoelectronics Semiconductor Materials and New-Structure Nanoelectronic Devices” (FY2009 - 2011) commissioned by the New Energy and Industrial Technology Development Organization.
Details of Research
Characteristics variability, which has become significant in miniaturized transistors, is mainly classified into two types, off-state current variability and on-state current variability, both of which have adverse effects on the performance of integrated circuits. Variability of off-state current exponentially increases the off-state current of some transistors in an integrated-circuit chip relative to the design value, dramatically increasing the standby power consumption of the entire chip. Further, because the operational speed of an integrated circuit is limited by the transistor with the lowest on-state current, variability of on-state current reduces the circuit's operational speed below the design value. In other words, the advancement of transistor miniaturization has been accompanied by serious problems: operational speed has not increased, while power consumption has.
In FinFETs introduced with the 22 nm generation, the primary cause of characteristics variability is the variability of a physical property, called the work function, of the gate electrodes. The threshold voltage, one of the critical electrical properties of transistors, is determined by the work function of the metal gate electrode material. The material generally used in the metal gate electrodes has a polycrystalline structure, in which the interfaces of individual crystal grains (grain boundaries) have a variety of work functions, resulting in threshold voltage variability (Fig. 2).
Figure 2: Primary factor causing the threshold voltage variability in a FinFET.
Therefore, instead of conventional polycrystalline metals (such as titanium nitride, TiN), the researchers used an amorphous metal (tantalum silicon nitride, TaSiN), which has no grain boundaries, for the gate electrodes, and then compared the variabilities of the electrical characteristics. Figure 3 shows electron microscope images of the fin cross-section with the amorphous TaSiN metal gate electrodes developed in this project, compared with the conventional polycrystalline TiN metal gate electrodes. The amorphous TaSiN metal gate electrodes were formed uniformly on the side walls of the fin channel, and the grain boundaries that cause the variability in the TiN electrodes are not found. Further, with the TiN electrodes a periodic, discrete spot pattern can be observed in electron-beam diffraction, reflecting the crystal structure, whereas with the TaSiN electrodes only a ring-shaped diffraction pattern is observed, indicating a lack of periodicity and confirming that the amorphous TaSiN metal gate electrodes were properly formed on the fin channel.
Figure 3: Comparison between the developed FinFET with amorphous TaSiN metal gates and the conventional FinFET with polycrystalline TiN metal gates.
Among the FinFET prototypes fabricated with a variety of design dimensions, the threshold voltage variability and transconductance variability were analyzed. Figure 4 is a Pelgrom plot of the measured threshold voltage variability; a smaller slope indicates smaller characteristics variability. Through the use of amorphous TaSiN metal gates, the threshold voltage variability was considerably reduced compared with polycrystalline metal gates, yielding the best value (1.34 mVµm) ever reported for FinFETs (Fig. 1, right). This value fulfills the requirement for proper operation of a 15 nm-generation SRAM. In other words, this technology is a breakthrough toward solving the threshold voltage variability issue that accompanies transistor miniaturization.
Figure 4: Pelgrom plots comparing the threshold voltage variabilities of the amorphous TaSiN metal gates and the conventional polycrystalline TiN metal gates.
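For readers unfamiliar with Pelgrom plots: they are based on Pelgrom's mismatch model, in which threshold-voltage mismatch shrinks with the square root of gate area. A minimal sketch of the relationship (the device dimensions below are illustrative assumptions, not figures from the article):

```python
import math

# Pelgrom's mismatch model: sigma(dVth) = A_VT / sqrt(W * L),
# where the coefficient A_VT (in mV*um) is the slope of the Pelgrom plot.
# The article reports a best A_VT of 1.34 mV*um for the TaSiN FinFET.

def sigma_dvth_mv(a_vt_mv_um: float, width_um: float, length_um: float) -> float:
    """Predicted threshold-voltage mismatch (mV) for a device of area W*L."""
    return a_vt_mv_um / math.sqrt(width_um * length_um)

# Illustrative minimum-size device: W = L = 0.05 um (assumed, not from the text).
print(sigma_dvth_mv(1.34, 0.05, 0.05))  # about 26.8 mV for the reported TaSiN slope
print(sigma_dvth_mv(3.0, 0.05, 0.05))   # about 60.0 mV for a hypothetical larger slope
```

The comparison makes the payoff concrete: cutting the Pelgrom coefficient directly cuts the expected mismatch for a device of a given area.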
Figure 5 shows a comparison of transconductance variabilities at gate lengths of 100 nm and 50 nm. As with the threshold voltage variability, the transconductance variability increases as the gate length decreases; however, the use of amorphous metal gate electrodes drastically suppresses this increase. Since transconductance variability is a primary cause of on-state current variability in transistors of the 14 nm generation and beyond, FinFETs using amorphous metal gate electrodes are expected to help solve the on-state current variability issue in those transistors. They are also expected to contribute to solving the low-yield problem of integrated circuits such as SRAMs and to achieving low power consumption while enhancing performance. This technology is available not only to device manufacturers but also to manufacturers of semiconductor-fabrication systems, materials, and measuring and test devices; the researchers are pursuing the work with technology transfer to, and collaboration with, these corporations in mind.
Figure 5: Suppression effects on transconductance variability by the developed amorphous TaSiN metal gates.
Future Plans
In the future, the researchers will fabricate integrated circuits using these FinFETs, aiming at circuit-level validation of the reduced power consumption and improved yields.