Category: Strategy



NHS green-lights cloud technology for storing patient data

NHS Digital has issued new national guidance for health and care organisations considering cloud services for storing patient information.

The document outlines a framework for assessing and managing risk around the use of public cloud technologies in the health and social care sectors in England, including legalities around how data should be stored and used and considerations to be made by organisations when choosing a supplier.

The guidance also contains best practice principles for handling customer data and highlights considerations to be made by trusts prior to the introduction of the General Data Protection Regulation (GDPR) on 25 May.

The cloud has been widely embraced in other UK industries under the government’s 2013 ‘cloud first’ policy for public sector IT.

While some parts of the NHS already make use of the cloud – such as NHS Choices and NHS England’s Code4Health initiative – the publication of the national guidelines marks the first time the technology has been approved for widespread adoption within Britain’s health service.

NHS Digital said the guidelines would “enable NHS organisations to benefit from the flexibility and cost savings associated with the use of cloud facilities.”

Central to the policy is that cloud suppliers used by NHS organisations must host their data in the UK, or European countries that provide an adequate level of protection as agreed by the European Commission.

Cloud suppliers covered by the Privacy Shield in the United States are also deemed safe for use.

All vendors are required to use cryptography to protect communications and undertake annual security assessments against recognised standards, such as the International Organisation for Standardisation (ISO) or the UK Government’s Cyber Essentials. Suppliers must undertake regular monitoring procedures and keep customers up-to-date with any changes to the service that could impact the security of the IT system and data.
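
Complementing the encryption requirement above, here is a minimal sketch (Python; illustrative only, not NHS Digital tooling) of a client-side TLS configuration that refuses unencrypted, unverified or legacy-protocol connections to a supplier's endpoint:

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Client context that only permits verified, modern TLS connections."""
    ctx = ssl.create_default_context()            # loads trusted CA certificates
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy SSL/TLS versions
    ctx.check_hostname = True                     # certificate must match the hostname
    ctx.verify_mode = ssl.CERT_REQUIRED           # certificate chain must validate
    return ctx
```

Any socket wrapped with this context will fail the handshake unless the supplier presents a valid certificate over TLS 1.2 or newer.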

Further, local Senior Information Risk Owners (SIROs), in conjunction with Data Protection Officers and Caldicott Guardians, must be satisfied with the security arrangements of any cloud service provider being considered, using national cyber security essentials as a guide.

Rob Shaw, Deputy Chief Executive at NHS Digital, said: “It is for individual organisations to decide if they wish to use cloud and data offshoring but there are a huge range of benefits in doing so, such as greater data security protection and reduced running costs when implemented effectively.

“The guidance being published today will give greater clarity about how these technologies can be used and how data, including confidential patient information, can be securely managed.”

NHS Digital’s guidelines have been published in partnership with the Department of Health, NHS England and NHS Improvement.

They come at a time when NHS trusts are increasingly looking to the cloud as their next big IT project, lured by the technology’s promise of rapid scaling-up without the associated hardware costs.

In a recent report from Digital Health Intelligence, a third of the NHS organisations surveyed said they were already delivering part of their infrastructure through the cloud, while 39% said they planned to introduce some element of cloud-based infrastructure within the next two years.

In addition to lowering hardware costs and improving the ability to recover data after a local system failure, it is argued that cloud technology could take strain off overstretched GPs by giving them more freedom to work remotely.

A digital transformation checklist

The market is filled with great insights on successful digital transformations. A recent report from McKinsey weighs the risks and benefits. Another, from MIT Sloan Management Review, compares digitally mature organisations with those still digitally adolescent. As with most adolescents, those less mature are noisier and messier, with the promise of better things to come!

Where should you start when assessing the opportunities and risks for your organisation? In particular, what are the balance points, for and against, digital transformation? What is the expected speed of any such change? How do you plan? (Can you plan?)

Here’s a high-level checklist — factors that influence success when driving a digital transformation:

Complexity v. Benefits: be clear about what you seek from a digital transformation for your organisation. Consider what such a transformation needs. Define the return on the investment and the effort you will expend in the transformation. Factor in how you might scale as you transform (and arrive at your end-state), and the infrastructure you need to deliver.

Architecture Speed v. Architecture Integrity: does it make sense that your organisation adopts the speed of a start-up? (Many seek to do exactly this when considering digital transformations, according to McKinsey). But can your organisation even consider this option? How should you balance system robustness and speed to market? Which is more important? What’s your appetite for risk, and how do you define what risk your organisation is prepared to consider in its systems architecture?

Predictability v. Flexibility: from which perspective do you wish to build your system architecture or data centre infrastructure? Do you want structured IT processes and systems that you own and manage, or a scalable public cloud that allows you to throttle up or down? Or a hybrid? What role will hyperconvergence and software-defined virtualisation play?

What to Keep v. What to Replace: audit your existing IT infrastructure. Again, is it in-house, a “pure” cloud, or a hybrid? More importantly, assess whether the cost and complexity of your existing infrastructure outweigh the benefits it delivers. Check how you quantify those benefits. Assess how much you spend simply on running hardware in data centres. (According to IDC, this can be as high as 80% of an IT budget.) If your existing IT infrastructure has ended up in multiple silos, you’re ready for a digital transformation.

Accountability, Clarity & Visibility: how do you make the transformation team accountable for the project’s success? Include senior management, which is collectively accountable for the clarity of the vision, strategy and business case. Make everyone on the broader team accountable for the visibility of the project within the organisation (and therefore how well it’s understood and accepted).

Measurement: none of this is simple, so don’t seek a simplistic set of measurements. Reflect the complexity of the transformation in how you measure the project’s progress and success. Channel that complexity to track progress, highlight bottlenecks, test assumptions and benchmark performance. Be clear on how each of these interact with each other.

Financials: finally, explicitly link your digital transformation projects to revenues and profits. Be clear about how you expect to grow revenues, reduce costs, or enter new markets.

The primary goal of any digital transformation strategy should be the business benefits you seek to gain: digital transformation should make your organisation faster at responding to opportunities and threats. Once those benefits are established and clearly communicated, your organisation’s readiness to start and finish the transformation will follow.

Published on September 30, 2016 by Sumir Bhatia, Vice President, Data Centre Group, Asia Pacific at Lenovo.

10 Tips For Giving Effective Virtual Presentations

What to know before you go live.

Presenting online? Try these suggestions to improve your results. | Illustration by Tricia Seibold

As audiences go global and you need to reach more people through technology (webinars, conference calls and teleconferences), you must consider the challenges of connecting with a virtual audience. Here are 10 valuable best practices to help you communicate successfully.

1. Be Brief

Audiences begin to lose attention after roughly 10 minutes of hearing from the same presenter. If you have more than 10 minutes of content, use interactive activities to keep your audience engaged (for example, take a poll, give quizzes, or ask audience members for their opinions via chat).

2. Be Simple

Keep slides simple — avoid too many words, graphics and animation features. Less is definitely more!

Light yourself well | Illustration by Tricia Seibold

3. Be a TV Personality

Look straight into your camera, not the screen. Wear clothing that is neutral in color (no plaids or stripes). Light yourself well and from above. Be mindful of what appears behind you in the background. Invest in a good microphone.

4. Be Standing

Even though your audience cannot see you, stand when you present. This allows you to stay focused and use good presentation delivery skills such as belly breathing, vocal variety, and pausing.

5. Be Prepared

Practice delivering your presentation with your technology in advance of your talk. Make sure all of the features of the technology work. Record your practice using the recording feature of your tool. Watch and listen to learn what works and what you can improve.

6. Be Assisted

Have someone available to deal with technical issues and to field email/text questions. Also, if you have multiple remote audience members in one location, be sure to pick one of them to be your “eyes and ears.” Ask them to queue up questions and facilitate discussion on your behalf.

7. Be Specific

Ask pointed questions to avoid too many people answering at once. For example, rather than ask, “Are there any questions?” try “Who has a question about the solution I provided?” Set a ground rule that people state their names prior to speaking.

Imagine your audience | Illustration by Tricia Seibold

8. Be Synchronized

Transitions are critical. When you move from point to point, you must connect what you just said to what is coming next. Transitions between topics and slides are good opportunities to get people reengaged with your talk.

9. Be Connected

Imagine your audience even though you can’t see them. You can place pictures of audience members behind your camera so you can look at people as you present.

10. Be Early

Encourage your audience to access your call or webinar before the official start time so you can iron out any technical issues and get them familiar with the technology.

September 26, 2016 | by Matt Abrahams, a Stanford GSB organizational behavior lecturer, author, and communications coach.

The 3 pillars of successful Service and Support

Failing to deliver on service and support can be extremely costly for any organisation: according to Lee Resources, 91% of unhappy customers will not willingly do business with you again.

So in today’s competitive landscape, what exactly should you be expecting from your Call Recording and Workforce Optimisation Service and Support providers? With over 60% of Business Systems’ personnel residing in this division, we outline the 3 pillars of successful Service and Support which we (and our customers) have come to recognise first hand!

1. Strategy & Design

They say ‘by failing to prepare, you are preparing to fail’. The same maxim holds true for major projects in your organisation. Without a solid strategy and design to guide your project plan, unexpected delays and costs will put your objectives and budget at risk. A good service provider brings years of experience implementing similar solutions, with skilled consultants helping you design the project so that you avoid unwanted surprises. Moreover, once you build an ongoing relationship with your provider, you gain access to timely advice on how best to address emerging technological, regulatory and operational trends in your industry.

2. Project Management, Implementation & Deployment

To ensure each implementation is as straightforward as possible, the service provider should assign a dedicated, qualified Project Manager so that the project is managed and implemented professionally, on time and to budget. It is also important that the project team holds an ethos of responsibility: taking ownership of technical issues, providing onsite management and having a proper escalation process in place if and when faults arise during the project life cycle. In addition, a reliable service provider should give you regular updates and reports on key milestones throughout the project life cycle.

3. Technical Support Services

When dealing with a service provider it is extremely important to consider a number of factors regarding their technical support capabilities including:

  • SLA adherence – Beyond listing expectations of service type and quality, does your provider’s SLA specify remedies for when requirements aren’t met?
  • Service delivery – Do they have a 24/7/365 service delivery capability?
  • Geographical coverage – How many engineers do they have operating across the country?
  • Spare parts holding – Do they have readily available spare parts to ensure fast support if and when a component of your system goes wrong?
  • Comprehensive offering – How extensive are their capabilities; for example, can they provide end-of-life (EOL) support for discontinued solutions if needed?
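
As a concrete (and purely illustrative) example of checking SLA adherence, the short Python sketch below compares a month's recorded downtime against an assumed 99.9% availability target; the target and the 30-day month are invented figures, not a Business Systems commitment:

```python
def sla_met(downtime_minutes: float, days_in_month: int = 30,
            target_pct: float = 99.9) -> bool:
    """True if measured availability meets or beats the SLA target."""
    total_minutes = days_in_month * 24 * 60
    availability = 100.0 * (total_minutes - downtime_minutes) / total_minutes
    return availability >= target_pct

# 99.9% over a 30-day month allows about 43 minutes of downtime.
print(sla_met(40))  # → True (within the allowance)
print(sla_met(50))  # → False (SLA breached)
```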

Service and Support excellence is the foundation on which long-term customer relationships are built, and as a customer it can have a huge impact on the return on your investment as well as the total cost of ownership.

If you need to find out more about what you should be expecting from your service provider and the different levels of support available, then check out our Service & Support webpage.


Posted on June 15, 2016 by Business Systems UK


The Project Management Office (PMO)

Program and Project Management Offices (PMOs) have been in the news. OK, you won’t have read about this in your daily paper, but in the UK the PMOSIG became incorporated as the Association for Project Management’s 13th specific interest group a couple of years back. While PMOs have been around for a long time, this was a big step forward for the recognition of the work they do. And they do a lot more than just produce reports.

The role of a PMO

A PMO is the backbone of a successful project management approach in an organization. It is a function that provides decision-support information, although it doesn’t make any decisions itself. A PMO underpins project delivery by ensuring that all business change in an organization is managed in a controlled way. According to the UK Office of Government Commerce’s standard for Portfolio, Program and Project Offices, the most mature PMOs provide:

  • Governance: ensuring that decisions are taken by the right people, based on the right information. The governance role can also include audit or peer reviews, developing project and programme structures and ensuring accountability.
  • Transparency: providing information with a single source of the truth. Information should be relevant and accurate to support effective decision-making.
  • Reusability: stopping project teams from reinventing the wheel by being a central point for lessons learned, templates and best practice.
  • Delivery support: making it easy for project teams to do their jobs by reducing bureaucracy, providing training, mentoring and quality assurance.
  • Traceability: providing the function for managing documentation, project history and organizational knowledge.

So what does that actually mean in practice? PMO teams fulfill a variety of functions on a day-to-day basis including:

  • Gathering data about project progress and producing reports
  • Developing standards and processes
  • Encouraging (or enforcing where necessary) the use of those standards and processes
  • Managing resources for projects
  • Delivering training and mentoring project team members
  • Managing dependencies across multiple projects
  • Tracking benefits
  • Reporting on financial information such as return on investment.
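
As a toy illustration of that last bullet, a PMO might automate simple portfolio reporting along these lines (the project names and figures below are invented examples, not data from the ESI study):

```python
def roi_pct(benefit: float, cost: float) -> float:
    """Return on investment expressed as a percentage of cost."""
    return 100.0 * (benefit - cost) / cost

# Hypothetical portfolio data a PMO might gather from project teams.
portfolio = [
    {"project": "CRM rollout", "complete": 80, "cost": 120_000, "benefit": 180_000},
    {"project": "Data centre migration", "complete": 45, "cost": 250_000, "benefit": 310_000},
]

for p in portfolio:
    print(f"{p['project']}: {p['complete']}% complete, "
          f"ROI {roi_pct(p['benefit'], p['cost']):.0f}%")
```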

As part of this, the PMO is also the guardian of Enterprise Project Management tools and project management methods. There will normally be an expert (or several) in the PMO who can support project managers and their teams with using any project-related software.

Different types of PMO

PMOs look different in different organizations, as you would expect. A recent study by ESI found that nearly 60% of companies have more than one PMO, so decentralization is by far the norm.

Over a third of PMOs have more than 10 members of staff, and the location of the PMO is evenly split between IT, another business function and at a corporate level, so PMOs can be found pretty much anywhere in an organization.

In some companies, the project managers report directly to the PMO, although this is not as common as you might imagine. More than half of the project managers in the companies surveyed by ESI reported in to somewhere else. The increasing maturity of the PMO function means that we are likely to see more and more project managers reporting into a PMO in the future, which in turn provides a better opportunity for standardization and embedding tools and processes.

Your PMO might be a central function reporting to the Board, or it might be a department within a division. You may have a hub-and-spoke model with a central PMO and divisional units in different locations. The PMO might even be a temporary team, put together to support a large program. It may incorporate a centre of excellence for training and standards, or that might be separate. In short, there are a number of different ways for a PMO to operate, and they all have the objective of providing operational efficiencies and supporting the successful delivery of change.

Whatever model you choose for your PMO, getting the implementation right will undoubtedly make the difference between a function that increases the success of projects and one that just focuses on retrospective reporting. A mature PMO can really help an organization make the most of the tools, methods and the skilled staff they have, by ensuring all these resources are used in the best possible way to support the organization’s strategic goals.

CIO: How to build your personal brand

As a CIO you have to carefully manage and nurture the perceptions that others have of you. By ‘others’ I mean internal staff and stakeholders and, outside your company, your peers, industry professionals and thought-leaders. One of your key assets is your brand, which encompasses more of your character and skillset than your job title, your CV and your career history. This article explains how you can build a brand that earns you respect throughout your sector, and how developing that brand can turn you into a sought-after personality in your marketplace.

Continue reading “CIO: How to build your personal brand”

6 CRM predictions for 2016

So what will be the big trends in CRM in 2016? Here are six predictions.

CRM software will become even more social. “In 2016, we’ll see a lot more CRM providers adding new social media features, whether that be tracking customer interactions or suggesting new contacts,” says Marc Prosser, cofounder, Fit Small Business. “Nimble is out ahead on this, but expect others to add these features while their team (and others) devise new ways CRM can take advantage of social media.”

Mobile CRM will become a must-have. In 2016, “we’ll see CRM go mobile in a big way,” says Prosser. “So far, most mobile CRM apps have focused on providing a basic phone-ready version of the desktop version, usually without the full set of features.” Over the next 12 months, however, “expect to see CRM mobile apps adding features that interact with map and note-taking apps.” Also, “CRM will become less hierarchical and easier to use on the go.”

Sales reps will rely on “mobile CRM [to] keep connected and in touch with prospects and their sales manager,” adds Sean Alpert, senior director, Product Marketing, Sales Cloud, Salesforce. “Real-time data [will] keep reps in the know about everything from usage rates to open service tickets to breaking news about the prospect they’re about to visit. And, mobile CRM [will become a] powerful sales tool as more and more reps eschew traditional slides in favor of showing a demo on their phone or pulling up the latest analytics or dashboards on their [mobile] device.”

Integration will be the name of the game. “It’s increasingly important that your CRM be able to seamlessly integrate with your ecommerce platform, your marketing automation software, your analytics software, your accounting system… the list goes on and on,” says Katie Hollar, CRM expert at Capterra, an online tool for businesses to find the right software. “Rather than spending hours downloading and uploading CSVs of data from one system to another, CRM users will demand that their provider build these native integrations with other platforms to make them more efficient. And if CRM vendors can’t keep up with the demand, users will switch systems, finding one that works better with their existing infrastructure.”

“CRMs will evolve from sales-oriented tools to truly integrated marketing and sales platforms,” predicts Kathleen Booth, CEO, Quintain Marketing. “There has already been some movement in this direction, with many CRMs, such as Salesforce, offering integrations with marketing software. But in the future, integrations will be replaced by all-in-one software platforms that truly marry the needs of sales and marketing,” she says. “One example of a company that is doing this successfully right now is HubSpot, which added a free CRM to its marketing software last year. Expect more companies to enter this market in 2016.”

Vertical CRMs will give traditional CRM solutions some serious competition. “In 2016, the ‘verticalization’ of CRM solutions will be accelerated,” says Adam Honig, cofounder and CEO of Spiro, a personal sales app for salespeople. “A real estate salesperson has different needs than a medical device salesperson, and companies are increasingly realizing that they could benefit from using industry-specific CRM solutions like Veeva, Vlocity and OpenGov,” he says. “These vendors’ built-in best practices and processes provide a level of expertise that companies just don’t get with a generic CRM solution.”

As a result, “horizontal CRMs will start being replaced by industry-specific vertical CRMs that help you navigate the specific challenges of your industry,” says Anatoly Geyfman, CEO, Carevoyance. “Healthcare is a big example of this,” he says. “Veeva, a CRM for the pharma [and life sciences] industry, was in the first wave of these, but the wave is not over.” Now, as a result of an influx of industry-specific software solutions, “even Salesforce is releasing industry-specific features and brands for its CRM product.”

More CRM platforms will be equipped with predictive analytics capabilities. “In 2016, CRM systems will have analytics engines behind them that will enable the ability to provide real-time offers to customers based on predicting what they will want next or what kind of product or service they might buy next,” says Rebecca Sendel, senior director, Data Analytics and Customer Experience Management Programs, TM Forum, a global industry association for digital businesses.

“Predictive analytics combined with CRM data gives marketers and salespeople the chance to learn, at a deeper level, customers’ habits and then react to those in real time,” says Vicki Godfrey, CMO, Avention, a provider of data solutions. “This makes for more personalized interactions, which leads to increased sales, better customer relationships and reduced churn rates.”
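
As a minimal sketch of that idea (invented example data; no vendor's actual engine works this simply), a toy model could suggest a customer's likely next purchase from the most common follow-on purchase in historical CRM records:

```python
from collections import Counter

# Invented example data: each list is one customer's ordered purchase history.
histories = [
    ["crm_basic", "crm_pro", "analytics_addon"],
    ["crm_basic", "analytics_addon"],
    ["crm_basic", "crm_pro"],
]

def predict_next(last_purchase, histories):
    """Most frequent product bought immediately after `last_purchase`."""
    followers = Counter(
        history[i + 1]
        for history in histories
        for i, item in enumerate(history[:-1])
        if item == last_purchase
    )
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("crm_basic", histories))  # → crm_pro
```

A real predictive engine would use far richer features and models, but the shape of the problem (learn from past behaviour, react in real time) is the same.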

Look for the CRM of Things. “We’ve seen the Internet of Things (IoT) make major headway this past year, and CRM will begin to reap the benefits in 2016,” says Dylan Steele, senior director, Product Marketing, App Cloud & IoT Cloud, Salesforce. “Companies today want a complete understanding of their customers, and with billions of connected devices generating 2.5 quintillion bytes of data every day, it’s more important than ever to know how this data can create an even more personalized customer interaction.”

So expect to “see smart devices linked to CRM, enabling automated business notifications, follow-ups for sales support, and billing processes that will redefine immediacy for customer service,” says Kevin Roberts, director of Platform Technology at a cloud ERP solution provider.

By Jennifer Lonoff Schiff, CIO, Nov 23, 2015 6:09 AM PT

True Cloud Architecture or just Cloud Hosted – Cirrus True Cloud

If you take a single instance of a product, host it in a data centre and connect it up to your site or wide area network (WAN), you have simply changed the location of your infrastructure. Yes, a Telecity data centre in London will be far more secure than your own server room, but the operating principles and single points of failure remain.

For most services that are not business critical, or what are called “high availability” services, this works and is enough. For services that would have a serious business impact were they to go down for an afternoon, as we saw yesterday, you need to look at True Cloud.

True Cloud is where you have multiple instances of your product in different locations, and services can be consumed from all locations in real time. The key here is “real time”. It must be a live network, not a failover plan that moves you from one data centre to another when something like yesterday’s outage occurs. That can take hours, plus the process of moving back at some point.

When looking at telephony, most would argue that standard office users, whilst negatively impacted by an afternoon outage, could be managed via mobile devices and email communication. The contact centre, however, is 100% business critical. Not being able to serve or sell to your customers for an afternoon can have astronomical consequences.

Here is what a True Cloud network looks like:

  • Three data centres in different geographical locations: Manchester, Birmingham and London.
  • Any Cirrus end point (the Cirrus desktop client (vDesk), mobile or landline DDI) can consume calls from all three data centres in real time.
  • All customers have an exact replica of their service on ALL three data centres. If a data centre goes down, your latest service settings operate seamlessly from the remaining two.
  • When breaking out of the Cirrus network, we connect over 11 internet service providers (ISPs), meaning your contact centre is not reliant on a single network to be operational. Public internet issues like yesterday’s can be managed and routed around in real time – not possible on a point-to-point SIP trunk set up over one network.
  • Cirrus talks to every end point in real time to check its connection, dynamically routing around issues and congestion to deliver our quality of service (QoS) guarantee.
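
The real-time routing described above can be sketched roughly as follows (Python; the data centre names come from the list, but the health-check and latency logic is an invented illustration, not Cirrus’s actual algorithm):

```python
from dataclasses import dataclass

@dataclass
class DataCentre:
    name: str
    healthy: bool      # result of the latest real-time health check
    latency_ms: float  # measured round-trip time to this site

def route_call(centres):
    """Pick the lowest-latency healthy data centre; fail only if none remain."""
    candidates = [dc for dc in centres if dc.healthy]
    if not candidates:
        raise RuntimeError("no healthy data centre available")
    return min(candidates, key=lambda dc: dc.latency_ms).name

centres = [
    DataCentre("Manchester", healthy=True, latency_ms=18.0),
    DataCentre("Birmingham", healthy=False, latency_ms=12.0),  # simulated outage
    DataCentre("London", healthy=True, latency_ms=25.0),
]
print(route_call(centres))  # → Manchester (routes around the Birmingham outage)
```

Because every site holds a full replica of the service, the routing decision can be made per call, with no failover window.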

No matter how large or resilient your own infrastructure, or how many times the contract states “no single point of failure”, when you pay on a subscription model (effectively renting services) you should get access to a network, infrastructure and resilience that wouldn’t make financial sense to build yourself. With a hosted contact centre you don’t; with Cirrus True Cloud you do. It really is that simple.

Cloud Applications – Out of Sight, Out of Mind?

Very interesting reminder….

I’m just completing a project where all the telephony for the client I am working for has moved to a cloud service provider – every part of the service from ACD, SIP Proxy, SBC and the associated reporting tools and user management all reside there.

It has been a very interesting project from both a technical and a planning perspective. One of the biggest lessons I have taken away is that even though a platform might sit in the cloud, it is very important to consider the service being provided and what you, as the client, need to have in place to make it an overall success.

Of course it’s possible to have Cloud Storage as a Service or Telephony as a Service but before you look at taking on such a project have you thought of:

  • demarcation points
  • incident management ownership and accountabilities
  • operational resilience
  • how much your provider really knows your business

I would advocate a move to the cloud for many applications, however just because an application resides there it does not mean that it is ‘out of sight, out of mind’.

ITIL IT Service Management in 5 Minutes

Organizations all over the world, from NASA to Disney, utilize ITIL to help improve their IT processes. But what is ITIL Service Management? Here’s what you need to know, and how you can use ITIL to benefit your own IT organization.

What Does ITIL Stand For?

ITIL is an acronym that stands for “IT Infrastructure Library”. It was originally developed in the UK as a series of books explaining procedures and best practices for the IT industry to follow. The goal was to standardize the management of IT, so that every organization wasn’t doing its own thing but had a common set of IT standards to follow.

How Does it Work?

ITIL Service Management acts as a guideline for service delivery in the IT world. If you are committed to conducting best practices in the industry, ITIL is the way to go. As of today there are five different books, explained below.

ITIL Service Strategy

This portion of ITIL can be thought of as how an IT organization can best position itself for long-term success. Service Strategy discusses financial management and how to improve business relationships. It answers the question:

How can my IT department succeed over a long period of time?

ITIL Service Design

Designing IT systems should always involve a very important element: the user. Oftentimes when planning or designing a system, no consideration is given to the specific intricacies of a business or its users. This section answers the question:

How can I plan my IT resources around my business?

ITIL Service Transition

When IT projects reach completion, they transition to becoming an actual service that people in an organization will use. For example, when a project to migrate to a new IT asset management system is complete, the system goes “live” for users to begin working with. Service Transition works to answer the question:

How can I best transition an IT project over to a service for users?

ITIL Service Operation

Problems are a fact of life in IT. Without tech problems, most IT professionals would be out of a job! Service Operation is quite specific in helping provide a service level agreement framework for your IT service desk. It’s where you go to find the answer to:

How can my IT department meet SLAs?

ITIL Continual Service Improvement

No one wants to repeat mistakes. In IT, repeatable processes can be captured and used to improve efficiency and reduce bottom-line costs. Improvement is not always easy, however, and many IT departments need help with:

How can my IT department learn from past successes and failures to keep improving?

Written by Samanage

Microsoft Dynamics, the real ERP alternative to SAP

Last Thursday, 19 November, Microsoft at last gave more details about its new Microsoft Dynamics AX, so we can start to describe this revolutionary new ERP platform: cloud based (with an on-premise option to follow after mid-2016), a modern HTML5 interface with the Office 365 look and feel, Power BI, real-time analytics with in-memory database technology, Office 365 and CRM integration, machine learning and much more.

This new version represents a huge leap in technology while maintaining the proven functionality that has helped thousands of companies around the world optimise their processes and keep operating in an ever more challenging and interconnected economic world.

The objective of this post is to look at how Microsoft Dynamics compares to SAP, the other big player in the ERP market, which is still dominant in implementations at large companies, especially in its German homeland, where it still holds around a 50% market share (about 20% worldwide). That means a lot of potential new customers for Microsoft Dynamics once they get the opportunity to see that Microsoft Dynamics offers much more for their money.

Unlike other second-hand reports you may see out there, I can report from my own experience. I have worked in the Microsoft Dynamics world for more than 8 years, and I am currently based in Germany, where I am in contact with many companies that use SAP. I have also had intensive experience with the new SAP version, S/4HANA with Fiori, over the last 6 months. I was even at SAP headquarters in Walldorf to take part in an official SAP HANA course 🙂

I want to focus my comparison on several points: the cloud, the development experience, the BI platform, the in-memory database, the functionality, the availability of professionals and training possibilities, and finally the quality of the IT partners.

Cloud platform

With the arrival of Satya Nadella, Microsoft took the courageous path of betting on the cloud. At the time, cloud was still considered hype, and only companies like Salesforce and Amazon were at the forefront of cloud innovation. This has all changed: since 2010 Microsoft has been innovating in the cloud, building Microsoft Azure with an extensive geographical presence that allows customers to keep their data near their operations. Even in countries like Germany, a Germany-only cloud is provided in collaboration with local partners such as T-Systems.

Products like Office 365 and Microsoft Dynamics CRM are a huge success, and the pace of innovation is quite impressive. That does not mean Microsoft presents an incoherent set of products, because all of them are easily accessible from the main Microsoft Azure platform. Even casual developers can start playing with web and mobile applications using technologies like machine learning and big data with just a single account, and perhaps not-too-expensive fees once the free account expires. This is especially important because it builds a developer community that drives exponential innovation.

SAP’s cloud offering is based mainly on acquisitions of other companies such as Ariba, Concur and Fieldglass. These are great products, but they are absolutely not a homogeneous set, which creates confusion for the customer. SAP is also building its own datacentres and increasingly providing its business solutions in the cloud, which has hurt revenue from on-premise products, but this year it is almost doubling its cloud business compared with last year’s numbers. As a developer, you can also start playing a little with HANA in the cloud, developing web applications, but it still languishes behind the developer community built by Microsoft.

Development experience

The development experience with SAP, from the point of view of someone coming from a world of modern programming languages, is a horror story. With the new Microsoft Dynamics Ax, you develop your business code in X++ and some tasks in C#, everything from a single development platform: Microsoft Visual Studio. Developing in Visual Studio is simply a pleasure because of the tool’s productivity, its options and its nice look and feel.

By contrast, developing business applications for the new SAP S/4HANA means learning ABAP, XSJS, Java, SQLScript, HTML5, CSS and JavaScript, and using several development IDEs such as the ABAP Workbench, ABAP Development Tools for Eclipse and the SAP Web IDE. Just learning ABAP is a nightmare, and the language itself is obsolete, dating back to the first versions of SAP. In its early stages ABAP was only a reporting language, and for the sake of backwards compatibility it evolved first into a procedural language and finally, in a rather artificial way, into an object-oriented one. Its original name was already “Allgemeiner Berichts-Aufbereitungs-Prozessor”, which means something like “general report processor”. There has long been a rumour that SAP would like to kill off ABAP, but it is still there in the latest version, S/4, because almost all the business logic is built in ABAP. That forces the developer to learn OData in order to build interfaces between the logic, the new database and the new HTML5-based interface. Obviously, the developer has little time left to think about writing business code when so much time goes into so many languages and interfaces.
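
To give a feel for what consuming such an OData interface involves, here is a minimal sketch that builds a standard OData query URL using the $select/$filter/$top system query options. The service root and entity set names are hypothetical examples, not real SAP or Dynamics endpoints:

```python
from urllib.parse import urlencode

def build_odata_query(service_root, entity_set, select=None, filter_expr=None, top=None):
    """Build an OData query URL from the standard system query options."""
    params = {}
    if select:
        params["$select"] = ",".join(select)   # only fetch the listed fields
    if filter_expr:
        params["$filter"] = filter_expr        # OData filter expression, e.g. "Amount gt 1000"
    if top is not None:
        params["$top"] = str(top)              # limit the number of returned entities
    query = urlencode(params, safe="$")        # keep the leading $ of each option literal
    return f"{service_root}/{entity_set}" + (f"?{query}" if query else "")

# Hypothetical service exposing customer transactions
url = build_odata_query(
    "https://example.com/odata",
    "CustomerTransactions",
    select=["CustomerId", "Amount"],
    filter_expr="Amount gt 1000",
    top=10,
)
```

Whatever framework sits on the server side, the consuming client ultimately issues HTTP GETs against URLs of this shape, which is why the developer ends up having to understand OData on top of everything else.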

When you are a Microsoft Dynamics Ax developer, you immediately have access to all the information you need to be productive. All the database tables and business classes are pretty well documented. Just search for CustTrans on Google and you will land on the Microsoft page on MSDN where the table is described. Now try to do the same with SAP to find out which table contains the customer transactions, and you may spend quite a lot of time just trying to find basic information.

Finally, regarding the development of user interfaces: in the latest version of Microsoft Dynamics you only need to know one technology, HTML5. Everything can be done with drag & drop and by coding the events and methods of the form object in Visual Studio. There is no need to be a web geek to develop business applications in the cloud.

Meanwhile, with SAP you still have to learn to develop with the old Dynpro, maybe Web Dynpro, maybe SAP Personas, and finally Fiori. Getting Fiori up and running is not a job for novices either, and requires a far-from-easy configuration. And developing for Fiori is not the easy task SAP tries to sell you: you have to go much deeper into the HTML details, and also build the OData interface between the user interface and the business code in ABAP. That is, if the business code is in ABAP, because if the business code is in HANA, then nobody, not even most SAP professionals, knows yet where on earth the business code lives.

BI platform

Microsoft’s BI offering for Microsoft Dynamics consists of the cool new Microsoft Power BI and the new services from SQL Server 2016. That’s it! Just take a look at the new Power BI and you will be impressed. Behind it sits an in-memory database that delivers results at light speed.

On the other side, with SAP you will find a myriad of products and offerings, which makes it hard to understand which one you actually need. Some are more appropriate for managers, others for your department leaders, and others again for the shop floor. So you will have to choose between SAP Lumira, SAP BW, SAP Crystal Reports, SAP BusinessObjects Web Intelligence, SAP BusinessObjects Explorer, SAP BusinessObjects Dashboards, and more.

This is not only far from easy, it is also not cheap!

In-memory database

In the field of in-memory databases, SAP may have the advantage with its HANA database. Now Microsoft, with SQL Server 2016, seems to be serious about in-memory technology, and this is the main reason why the new Microsoft Dynamics Ax will only be available in the cloud until the middle of next year: Microsoft has to wait until the in-memory technology available on Azure is also available for on-premise systems. In that situation, I would recommend that big companies with millions of transactions per day ask for a realistic performance test of both platforms before deciding. If HANA copes better with such a huge amount of data, it could still make sense to spend so much money on a complicated platform like SAP. Otherwise, if you don’t process millions of transactions per day, you are trying to kill a flea with a sledgehammer.
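
As a rough sketch of what the core of such a performance test might look like, the harness below times a batch of transaction calls and reports throughput. The `insert_transaction` callable is a hypothetical stand-in for whichever platform API is under test; a real benchmark would of course run against the actual database with realistic data volumes:

```python
import time

def measure_throughput(insert_transaction, n_transactions):
    """Time n_transactions calls and return transactions per second."""
    start = time.perf_counter()
    for i in range(n_transactions):
        insert_transaction(i)
    elapsed = time.perf_counter() - start
    return n_transactions / elapsed if elapsed > 0 else float("inf")

# Trivial in-memory stand-in for the real platform call, just to show usage
store = []
rate = measure_throughput(store.append, 100_000)
```

Running the same harness against both platforms, with the same transaction mix, is the kind of apples-to-apples comparison worth demanding before committing to either one.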


Functionality

The new Microsoft Dynamics Ax comes with few changes to the functionality, which is important because the functionality inherited from Microsoft Dynamics Ax 2012 R3 is already proven and does not need many changes. In that respect it is a little like SAP, which maintains its functionality in modules like MM, SD, FI, CO and MCM and develops industry solutions on top of them. Almost all the functionality provided by SAP is included in Microsoft Dynamics. I can also say it in German, if that sounds more professional to you: Lieferbeleg erstellen, Materialien kommissionieren, Warenausgang buchen (create delivery note, pick materials, post goods issue)… everything is available in Microsoft Dynamics Ax to implement your processes in the most optimal way.

One curious thing about SAP implementations is that your company is supposed to work the way SAP dictates. Even in the analysis phases of a SAP project, users are simply shown some SAP PowerPoints about the functionality and asked to present their gaps. That can be fine for companies that are not clear about how they should work or optimise their processes, but if you have a clear view of your company and want to be ahead of the competition, you will need much more. An example of such a company is Inditex in Spain (the Zara stores). They have built their own system using no standard software, because they are a step ahead of the rest, and adapting to processes designed by others would slow them down. But that is a radical option not everybody can take. Microsoft Dynamics Ax offers a flexible solution that you can adapt and that will grow with you. Obviously, making changes to the system without understanding the standard functionality is not a good idea either.

Availability of professionals and training possibilities

It is reported that SAP consultants in Germany can make almost €100,000, and developers can even reach the €80,000 mark. That is only a hint of how expensive your project will be if you choose SAP. To become a SAP professional, almost the only route is the expensive courses at SAP, because the system and its myriad of options are quite inaccessible to the casual person. Even if you work at an end-user company, it will not be easy to learn the system without some SAP support. There are some initiatives from SAP, like openSAP, but most of the courses offer little more than marketing for new technologies.

To become a Microsoft Dynamics Ax developer, you don’t need many prerequisites: it is enough to be a .NET developer with some SQL skills to climb the ladder. Once you have the chance to play with a system at an end-user company or at a Microsoft Partner, you can learn quite fast if you have the right colleagues around you. Microsoft also provides plenty of training documents on CustomerSource and PartnerSource, and if you wish you can attend some of the official Microsoft courses, which will speed up your skills with the products.

So I don’t see any shortage of Microsoft Dynamics professionals; even SAP consultants could become Microsoft Dynamics consultants quite easily, providing great ideas to their colleagues along the way. In Germany I have experienced a rather incompetent recruiting process, where companies keep positions open for months or even years because they are looking for someone with Microsoft Dynamics experience but also fluent German, when in most cases that is absolutely not needed.

IT Partners

Microsoft Dynamics Ax has been on the market for more than a decade, and there are plenty of Microsoft Partners you can trust, some of them specialised in specific sectors like retail or industry. Companies like my current company AlfaPeople, MODUS Consult, COSMO Consult, Avanade, HSO, Impuls and SPH are good examples, and in Spain Prodware, AxAzure, Iniker IFR and Quonext are examples of highly productive Spanish Microsoft Partners. Worldwide, I can mention companies like Sunrise Technologies, HSO and K3, but also the AlfaPeople network 🙂

There are also a lot of SAP Partners out there, especially here in Germany, some of them residing in offices that look like palaces, showing how much money has been made with SAP over the last decades. Now it is time for them to prove they are worth their money, and for all current SAP users to take a look at Microsoft Dynamics Ax. They will be surprised to see how much money they can save and, more importantly, how much more agile they can become, reacting more quickly to the needs of their customers in a more interconnected, cloud-oriented world.

Pedro Rodriguez Parra
Dynamics Ax Developer at AlfaPeople GmbH

Microsoft to open UK datacentre

CRM, Home Page, O365, SharePoint, Strategy

The new UK-based datacentre is said to be opening from late 2016, though it sounds like the MoD will begin using it sooner than that.

O365 and Azure UK data centre coming 2016

In his keynote speech at the Future Decoded event in London yesterday, CEO Satya Nadella stated that customers in the UK would at last be able to store data within the country, allaying fears (if not actual legal impediments) around governance and data protection.

In addition to Microsoft Azure and Office 365, the UK datacentre will support Microsoft Dynamics CRM Online sometime afterwards. Microsoft will also offer Azure ExpressRoute to provide customers with the option of a private connection to the cloud.

“At Microsoft, our mission is to empower every person and organisation on the planet to achieve more,” said Nadella. “By expanding our datacentre regions in the UK, Netherlands and Ireland, we aim to give local businesses and organisations of all sizes the transformative technology they need to seize new global growth.”

He added that the new local Microsoft cloud regions will enable data residency for customers in the UK, allowing data to be replicated within the UK for backup and recovery, reduced network distance and lower latency.

Nov 11, 2015

Microsoft empowers business transformation

CRM, Home Page, O365, SharePoint, Strategy

“Businesses are hungry to seize new opportunities using technologies like machine learning and predictive analytics,” said Satya Nadella, chief executive officer of Microsoft. “Only when businesses create a culture that empowers everyone to have access to data and insight that drive action will they be positioned to truly transform.”

Nadella demonstrated products and services built by Microsoft to empower industries, organizations and individuals to drive insight and action from their data.

“Microsoft Azure IoT services combined with Windows 10 IoT for devices and Power BI is fueling a degree of collaboration, visibility and insight from data unheard of in the oil and gas industry — from the oil field and operations center all the way to the boardroom,” said Gary Pearsons, vice president and general manager, Customer Support and Maintenance at Rockwell Automation.

There were several components of today’s announcements:

Tools for industries

  • Microsoft Azure IoT Suite. Microsoft Azure IoT Suite is an integrated offering that takes advantage of all the relevant Azure capabilities, simplified billing and easy provisioning to help businesses connect, manage and analyze all of their “things.” Available in preview later this year, this new offering will provide businesses with finished applications targeting common Internet of Things (IoT) scenarios — such as remote monitoring, asset management and predictive maintenance — to simplify deployment and provide the ability to scale their solution to millions of “things” over time. Azure Stream Analytics will be generally available next month as part of Azure IoT or as a standalone service. Currently in preview, Azure Stream Analytics helps customers process massive amounts of real-time, incoming data from “things” and services so customers can predict trends and automate service and responses.
  • Windows 10 for Internet of Things. Microsoft announced that Windows 10 will provide versions of Windows for a diverse set of IoT devices, under the Windows 10 IoT moniker. Windows 10 IoT will offer one Windows platform with universal applications and driver models that will span a wide range of devices, from low-footprint controllers such as IoT gateways to powerful devices such as ATMs and industrial robotics. Windows 10 IoT will also bring enterprise-grade security from the device to the cloud and native connectivity for machine-to-machine and machine-to-cloud scenarios with Azure IoT services.
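
To give a feel for the kind of computation a stream-processing service like the one described above performs, here is a minimal tumbling-window aggregation sketch. The event shape and window length are illustrative only, not Azure Stream Analytics syntax:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds):
    """Group (timestamp, value) events into fixed-size windows and average each window."""
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[int(ts // window_seconds)].append(value)  # assign event to its window
    return {w: sum(vals) / len(vals) for w, vals in sorted(buckets.items())}

# Simulated sensor readings: (seconds since start, temperature)
readings = [(0, 20.0), (3, 22.0), (7, 30.0), (9, 26.0)]
averages = tumbling_window_avg(readings, 5)  # 5-second tumbling windows
```

A managed service does this continuously and at much larger scale over live event streams, but the underlying windowed-aggregation idea is the same.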

Tools for organizations

  • Power BI is a service, now available in the U.S. and more than 140 markets around the world, that helps customers take the pulse of their business via live market operational dashboards, explore data through interactive visual reports, and easily share new insights with colleagues and customers. New Power BI connectors, dashboards and reports for some of the industry’s most popular data sources — including Google Analytics, Microsoft Dynamics Marketing, Zuora, Acumatica and Twilio — will be available soon.
  • The Spring ’15 release for Microsoft Dynamics CRM, expected by the end of the second quarter of 2015, will deliver significant performance enhancements, deepen interoperability with Office 365, and with new knowledge management enhancements, improve efficiency and collaboration between workers and businesses. The release also introduces Microsoft Social Engagement, the latest update to Microsoft’s social monitoring tool designed to enable people to monitor and engage in the context of their Dynamics CRM and/or Office application. The intelligence gained from this new solution will enable businesses to be better informed about what customers are saying across all social channels.

Tools for individuals

  • Office Delve, now globally available, uses sophisticated machine learning techniques to help people discover relevant documents, conversations and connections from across Office 365. In addition, Exchange Online and Yammer content is now accessible via the Delve experience.
  • The company announced the IT Professional and Developer Preview of Office 2016, a key milestone for the next version of Office on the Windows desktop. Office 2016 is expected to be generally available in the second half of this year. Microsoft encourages IT pros and developers from its enterprise customers to join the preview to prepare, begin testing and help shape the future of the product.
  • Skype for Business (previously Microsoft Lync) technical preview starts Monday, and the new Skype for Business client, server and service within Office 365 will be available starting in April. Skype for Business delivers an enterprise-ready voice and video collaboration experience based on the familiar Skype user interface, including the ability for Skype for Business customers to connect with anyone in the Skype network.

Customers and partners also highlighted the benefits of these products and solutions and demonstrated how they are taking the next steps to use systems of intelligence to transform their businesses.

VMware hails hybrid cloud as answer to enterprise Safe Harbour

Home Page, Strategy

VMware chief claims the roll-out of its hybrid cloud network should help ease enterprise concerns about off-premise data protection
VMware claims the outcome of the EU Safe Harbour ruling will cause minimal disruption to its operations and should serve to reinforce its hybrid cloud strategy.

The virtualisation giant was among the first, some years ago, to predict that the hybrid cloud model would emerge as the enterprise’s preferred way of consuming IT, while others backed the public or private cloud. In keeping with this, VMware rolled out its vCloud Air platform in 2013, which is designed to make it easier for users to move workloads between their on-premise and off-premise environments, and the firm has opened numerous datacentres worldwide to support it.

Users also have the alternative option to procure cloud services through service providers signed up to the vCloud Air Network programme, rather than from VMware directly.

Speaking at the VMworld Europe User conference in Barcelona, the company said this setup should enable the company to sidestep some of the challenges the disappearance of the Safe Harbour agreement could throw up.

This is because the firm’s vCloud Air Network of partners means European users of its cloud platform may not have to send data back to the US as there are already providers local to them that can meet their needs.

During a press Q&A at the event, Bill Fathers, executive vice-president and general manager of cloud services at VMware, said the firm has taken steps to adjust the wording of its contracts in the wake of the Safe Harbour ruling on 6 October 2015.

“We are modifying the language we’re using with our suppliers so we can give our clients more assurances around where their data will reside,” he said.

In the longer term, having the vCloud Air Network already in place should stand it in good stead, as the rest of the US cloud supplier community works out how best to respond to the news.

“Having a network of service providers in all of the geographies we operate is likely to be the winning strategy,” said Fathers.

“Rather than be a single, homogeneous US entity that tries to provide that service across a region such as Europe, we think having hundreds of thousands of service providers who can provide absolute assurances [to customers about where their data resides] makes sense.”

The Safe Harbour fallout

As previously reported by Computer Weekly, the European Court of Justice ruled on 6 October 2015 that the Safe Harbour data-sharing agreement should be considered invalid because it fails to “adequately” protect European users’ data during its transit to the US.

With more than 3,000 companies signed up to the agreement, concerns about the disruption this will cause to the way US firms operate have been aired by technology suppliers and the legal community.

VMware is not the only cloud company to react bullishly to the Safe Harbour announcement. Andy Jassy, senior vice-president of Amazon Web Services (AWS), has been equally quick to shoot down suggestions about the ruling’s impact on its European operations.

During a press Q&A at the 2015 AWS Re:Invent user conference in Las Vegas, he said: “We have a number of ways for our customers to move their data outside of the EU to AWS beyond Safe Harbour.

“As the European data protection agency has approved the AWS data protection agreement via the Article 29 Working Party, it really has no impact on our customers.”

However, Richard Munro, chief technology officer of vCloud Air for Europe, the Middle-East and Africa at VMware, told Computer Weekly at VMworld that US cloud firms should be wary of rushing to make declarations about their Safe Harbour stance without doing some deep due diligence first.

“With our service, your data will not move outside a country’s borders unless you want it to, which is part of our differentiation,” he said.

With the final draft of the EU General Data Protection Regulations due to drop before the end of 2015, and the uncertainty over the UK’s future in the European Union (EU), Munro said both these factors could compel more users to consider going down the hybrid cloud route over time.

“We see people use public cloud services and I think they’re acknowledging they’ve lost quite a degree of control to take up those services,” he said.

“When you get something such as Safe Harbour or the EU Data Protection changes coming from nowhere, it’s difficult for a company to assess what they should do as a business entity if they’re not in control of where their data is residing or what controls are around it.

“As more of these things crop up, we’ll see more businesses wish they had control of where their data is – even when it’s in the public cloud,” he said.

4 Ways IT Can Drive Innovation

Home Page, Strategy

CIO Priorities

In today’s rapidly changing business environment, every industry (healthcare, education, retail, transportation, agriculture, government services, and more) is transforming and experiencing disruption at some, if not all, levels. Now more than ever, business leaders are looking for the tools that will help them navigate these changes to drive innovation and competitive differentiation. And since you and your IT leadership team are closest to the technology, they’re looking to you for answers.

Help us, FAST! Business leaders are calling on you and your IT organization to supercharge innovation by giving them IT solutions quickly. They want responsiveness from your organization, as a true partner and team. If you can’t partner with them on a solution, they’ll build it or buy it themselves, using shadow IT. In fact, according to a Harvey Nash report, for 19 percent of CIOs surveyed, more than 25 percent of overall IT spend is controlled or managed outside the IT organization.[1]

The business wants choice. They want A, and if they can’t have A, they want you to provide them with an equally desirable option B or C or D. They control budget: IDC states that line-of-business (LOB) buyers now fund 61 percent of IT projects.[2] And they’ll buy what they need if they don’t think you’re offering a good option. Business users want their IT solutions to be simple and secure. They expect technology to stay out of the way of work. People don’t want to care about infrastructure when they go about their daily business, just as they don’t want to care about the nanometer process or the A8 processor when they use an iPhone. They’d like to simply pick up a device, use their applications, and get to their data. And they count on IT to make this happen for them.

Your True Mission

You know more about technology than anyone else on the CEO’s staff. Increasingly, you’re trying to apply your business and technology acumen to monetize IT assets, drive innovation, and create value throughout the enterprise. However, over time, your IT infrastructure has gotten so complicated that the CIO has inadvertently become the Chief Infrastructure Officer. And this makes it difficult to focus on what you know the business needs most: turning data into information that can be acted on as effectively and efficiently as possible. What gets in the way of your efforts to enable the business? Let’s start with the appetite for apps and an easy, consumer-like IT experience that just works.

End users—both customers and employees—are consuming apps and data that add tremendous complexity to your IT environment. There are potentially thousands of business apps that you must install, monitor, manage, archive, restore regularly, test, and plan for disaster recovery. Complexity also comes from changing organizational structures and procurement processes; an increasing number of silos; intricate chargeback models; and the need to maintain existing investments in infrastructure and IT services. You’re dealing with these challenges, but the truth is, you know the business equals the technology. And the CIO’s real job is delivering those applications and services that make the business real. So in today’s world, you need to be the Chief Innovation Officer, clearing the way for technologies that will change the game for your business.

The Passage Through

Every year, VMware meets hundreds of CIOs dealing with these same challenges. Many of them are thriving by embracing the promise of software-defined IT solutions. These types of solutions virtualize compute, networking, and storage and make them available to you in the cloud.

How? The power of a software-defined approach is what you might call the economics of simplicity, or cloud economics. You can realize the benefits of cloud economics in four key ways:

  1. Virtualization – Separating applications from the underlying infrastructure (compute, storage, and networking) allows you to keep the entire network-distributed application together in a segment. As a result, you can move and deploy it without having to tinker with its internal configuration from a networking perspective.
  2. Simplified management – With applications in virtual segments—whether that segment is a virtual machine, a virtual network, or a virtual storage type—you radically simplify the number of different types of things that must be managed in your data center.
  3. Automation – Instead of needing to automate the lifecycle of 3,000, 4,000, or 5,000 different applications, you need to automate the lifecycle of only a few segment types. And by automating the lifecycle of an individual segment, you can automate the lifecycle of all the applications you put into that segment.
  4. Choice – Once the application—and all that goes with it—is separate from the infrastructure, you can choose to run it in the cloud. That cloud doesn’t have to be a private cloud or a public cloud. You can deploy a hybrid cloud for flexibility and manage it as a single, unified cloud, using the same people, processes, and tool sets.

Why is the software-defined approach so compelling? Because the simplification it enables gets you closer to what you want IT to be: self-service, instantly provisioned, pay per use, elastic, and cost-efficient. Furthermore, it frees up IT budget, allowing you to redirect budget to innovation that drives your business’s top-line growth. Mission accomplished. Check out the article “Mobile-Cloud Technology Speeds Innovation” to discover how companies are leveraging cloud economics.

CEOs want CIOs to stop using jargon and focus on business needs

Home Page, Projects, Strategy

Public sector CIOs shouldn’t underestimate the IT literacy of CEOs and need to focus on collaborative working, the organisation’s use of data and solving business issues.

At a recent workshop, around 20 local government CEOs called on CIOs to “reshape their teams” to suit the future needs of the organisation, said a report by not-for-profit organisation Eduserv.

“Sometimes dealing with IT feels like heavy lifting all the time, trying to get behind and beyond the ‘tech speak’,” the report said, quoting one chief executive officer.

“Their frustration is with claims coming from IT – either suppliers or in-house teams – that they can ‘enable digital services’ or ‘deliver transformation’, without specific examples of real business issues solved by technology, with measurable outcomes relevant to the challenges they face,” it added.

Jos Creese, principal analyst at Eduserv, chaired the workshop and said CEOs believe technology can help transform councils and enable efficiency.

“It is logical that CIOs should play a role in helping organisations realise these gains. The opportunity for heads of IT and CIOs is to step up to these strategic challenges, seizing the chance to assist CEOs in driving and reshaping their organisations into the future,” Creese said.

IT jargon misses business targets

However, public sector chief executives want CIOs to “stop talking about IT and focus on solving business issues”.

“CEOs relayed a feeling that at least some IT professionals are still not in tune with real business needs and pressures, and are still too focused on clever technologies, rather than what it can do to transform service delivery,” the report said. It added that poor alignment of IT activity and business priorities remains an issue.

One of the main frustrations outlined in the report is the lack of data on how people access and use public services.

“While acknowledging the limitations of legacy systems, CEOs could not understand why IT seems to find it so hard to unlock data for wider re-use and provide better customer insight,” the report said.

Local authorities face constraints around budget and culture to change, and CEOs recognise that leading IT “is a tough role at a very tough time”.

In dealing with internal barriers, one CEO said IT teams must think about how they can work across different parts of the organisation to “reduce resistance to change”.

The report highlighted the need for collaboration and the use of national frameworks across councils, urging CIOs to implement IT systems without proprietary lock-in so they can link up with others.

Other requirements from CEOs include solving legacy IT problems, stopping purchases of IT that does not meet councils’ needs, and “doing away with Victorian ways of working” in IT teams.

Why it’s time to STOP “Adding Value”

CRM, Projects, Strategy

It’s probably the most commonly proposed response to price pressures and commoditisation: if we’re not prepared to cut our prices, we had better add more value for the customer. It’s a reasonable objective, but the sad truth is that most so-called “value-added” strategies simply add cost and complexity without making the offering any more desirable to the customer. In fact, they often have the opposite effect.

Safe Harbour invalidated by EU Court of Justice

Home Page, Strategy
The European Court of Justice (ECJ) has ruled that the Safe Harbour framework is invalid, but what does that mean for business?
According to the European Parliament, more than 3,000 companies currently use the framework for the transfer of data, including firms such as Facebook, Google and Microsoft. The Safe Harbour framework, administered by the US Department of Commerce, enabled US companies to self-certify that they have certain standards for the protection of personal data in place. However, the ECJ said the Safe Harbour framework is invalid as a mechanism to legitimise transfers of personal data from the EU to the US because it does not guarantee adequate data protection.
Many agree that the ruling has far-reaching implications for all businesses, particularly social media networks and other technology businesses that hold or process personal data of EU citizens in the US. While some lawyers believe the ruling will not cause any major disruptions, thousands of companies using Safe Harbour will have to review their data transfer processes.
“The judgement means businesses that use Safe Harbour will need to review how they ensure data transferred to the US is transferred in line with the law,” said David Smith, deputy commissioner at the Information Commissioner’s Office (ICO).
The ICO recognises it will take some time for businesses to do this, he said, noting Safe Harbour is not the only basis on which transfers of personal data to the US can be made. “Many transfers take place based on different provisions. The ICO has previously published guidance on the full range of options available to businesses to ensure they are complying with the law related to international transfers,” said Smith. However, the ICO will work with other data protection authorities in Europe and issue further guidance for businesses on the options open to them, according to Smith.
“The ruling does not mean there is an increase in the threat to people’s personal data, but it does make clear the important obligation on organisations to protect people’s data when it leaves the UK,” he said.
Disappearance of Safe Harbour
Christopher Jeffery, head of UK IT, telecoms and competition at international law firm Taylor Wessing, said the ECJ ruling forces US companies that need to take personal data from the EU down other compliance routes. “There are alternatives to Safe Harbour, but for most companies they take time and money to put in place. That will be an unwelcome distraction – no one was preparing for the abrupt disappearance of Safe Harbour,” he said. Although some commentators have raised the prospect of mass enforcement action against every US company signed up to Safe Harbour, Jeffery believes this is unlikely. “We expect the more pragmatic regulators to allow companies time to re-organise their compliance programmes,” he said.
However, Jeffery said in countries such as Germany – where Safe Harbour has long been regarded with suspicion – the regulators may not be so generous.
“They may feel concerns about Safe Harbour have been well-flagged and so businesses should have made alternative arrangements by now,” he said.
According to Jeffery, the key message to businesses is to “get on it” immediately.
“Getting model clauses signed, for instance, between affiliates and with key external suppliers should be relatively straightforward and helpful to show they are taking the issue seriously – go for the low-hanging fruit early to show a desire to move towards fuller compliance. Organisations that are slow to react and are seen to be doing nothing risk attracting regulator attention – and that will likely not end well,” he said.
Deema Freij, deputy general counsel and global privacy officer at Intralinks, said any company using Safe Harbour will need to evaluate how it protects personal data, as well as re-evaluate governance, risk and compliance processes to meet international data transfer requirements to the US without Safe Harbour being part of the mix.
“In anticipation of this ruling and because of the criticism Safe Harbour has received in recent times, many companies have already begun using model contracts as a means of meeting international data transfer requirements,” she said.
Alternatives could be scrutinised
However, Marc Dautlich, information law partner at legal firm Pinsent Masons, warned that while companies are able to adopt model clauses or implement binding corporate rules (BCRs) to help them meet the adequacy standards of EU data protection laws when transferring personal data outside of the EU, both options could now come in for scrutiny for similar reasons to those highlighted in relation to the Safe Harbour agreement.
Mahisha Rupan, data protection and privacy senior associate at technology law firm Kemp Little, also noted that BCRs only work for intra-group data transfers.
“Model clauses will need to be put in place between each data exporter and each data importer, which may prove to be impractical where a US company has thousands of EU-based customers,” she said.
Consent of the individual may also be used to justify certain transfers to the US, said Rupan. “But consent is tricky as it must be specific, informed and freely given,” she added.
Robert Lands, partner and head of intellectual property at law firm Howard Kennedy, said the ruling means extra due diligence into service providers will need to be conducted, as many companies outsource their human resources, payroll and other tasks involving personal data about customers or staff.
“European businesses using software supported from the US need to be wary. Remote access can often allow a technician to view personal data in the US, meaning a transfer of personal data can occur. A more transparent and accessible approach should be taken to data sharing,” said Lands.
“Obtaining explicit consent to justify transfers and creating new agreements between companies that share data may be further ways of meeting the requirements of the Data Protection Directive,” he said.
Bharat Mistry, cyber security consultant at Trend Micro, said US companies will have to look at local operations to process data.
“This is a good thing as it restricts data flow to within the EU or local country borders, therefore resulting in tighter control and enforcement by the EU and additional investment into Europe in the form of extra jobs in data processing,” he said.
Businesses affected outside the EU
In terms of the impact on businesses, Mistry said it will be niche startups or companies outside of EU borders – where the EU deems the data protection controls/laws do not meet its standards – that will suffer most.
“The large-scale social media companies with a presence inside EU borders will be able to access the data. Overall, the ruling is positive as the more distributed the data, the higher the chance of a breach,” he said.
Ashley Winton, UK head of data protection and privacy at international law firm Paul Hastings, said companies should also be mindful of another recent landmark case against Slovakia-based property website Weltimmo.
The ECJ’s ruling on 1 October 2015 is also expected to have far-reaching implications for tech giants processing data in Europe.
According to the ruling in favour of the Hungarian data protection authority, companies that have websites translated into another language – targeting consumers of European member states – may now have to comply with the regulations in each individual member state.
“Multinational companies that have elected to create an establishment in a more business-friendly jurisdiction are now likely to have their data protection practices scrutinised by local regulators across the EU,” said Winton.
“There are currently no rules limiting individuals bringing complaints regarding data protection across multiple jurisdictions simultaneously, so we may now see these complaints springing up from every direction,” he said.
Antony Walker, deputy CEO at techUK, said the ruling will cause real confusion and uncertainty for all sorts of businesses that need to transfer data between the EU and US.
“Businesses will be looking to the European Commission and national data protection authorities to steady the ship and provide clarity on what they need to do to ensure their transatlantic data transfers are lawful,” said Walker.
“This is a big issue for many small businesses as they will be faced with the time-consuming and costly task of working through the full legal implications. The ability to transfer data lawfully across borders is fundamental for a growing and dynamic digital economy. Businesses need stability and certainty in the legal framework to enable this to happen,” he said.
European Commission must make data transfers safer
Following the ECJ’s latest ruling, Claude Moraes, chair of the European Parliament’s Civil Liberties Committee, called for the immediate suspension of the Safe Harbour agreement and the initiation of a secure data protection framework that will guarantee the rights and privacy of European citizens.
“Compared with the strong, enforceable data protection legislation in the EU, Safe Harbour offers completely inadequate protection for EU citizens using services from US companies,” said Moraes.
“The Snowden disclosures threw these inadequacies into the spotlight as Safe Harbour does not provide any protection from mass surveillance activities because it contains a national security exemption that has never been clarified,” he said.
However, Moraes said there were also concerns prior to the Snowden revelations given that Safe Harbour is a non-binding agreement that lacks compliance by companies and gives no possibility for citizens to enforce their rights.
The decision by the European Court of Justice to declare the Safe Harbour agreement invalid forces the European Commission (EC) to act to ensure transatlantic transfers of personal data of EU citizens to companies in the US offer the continuity of protection required by EU law, according to Moraes. It also means the EC will have to come up with an immediate alternative to Safe Harbour, he said.
“The Commission has been in negotiations with the US for more than a year on improving the framework, but we have still received no update on these discussions,” said Moraes.
He called on the EC to put forward a complete and strong framework immediately for transfers of personal data to the US that complies with requirements of EU law as enshrined in the Charter of Fundamental Rights and EU data protection rules. The framework should also provide EU citizens with solid, enforceable data protection rights and effective independent supervision.
Responding to the ruling, the EC said it was an important step towards upholding Europeans’ fundamental rights to data protection.
“I see this as a confirmation of the European Commission’s approach for the renegotiation of the Safe Harbour agreement,” said EC first vice-president Frans Timmermans.
“We have already been working with the American authorities to make data transfers safer for European citizens. In light of the ruling, we will continue this work towards a renewed and safe framework for the transfer of personal data across the Atlantic,” he said.
In the meantime, Timmermans said transatlantic data flows between companies can continue using other mechanisms for international transfers of personal data available under EU data protection law.
He also promised “clear guidance” for national data protection authorities on how to deal with data transfer requests to the US.
While this ruling is widely considered to be significant, few believe the day-to-day operations of most companies will change significantly, particularly in light of the EC’s statement.
However, Austrian privacy activist Max Schrems – who brought the initial case against Facebook that was referred to the ECJ and resulted in the ruling on the Safe Harbour agreement – said US companies that aided US mass surveillance, such as Apple, Google, Facebook, Microsoft and Yahoo, may face serious legal consequences from this ruling when data protection authorities of 28 member states review their co-operation with US spy agencies.

ExpressRoute for Office 365

CRM, Home Page, O365, Projects, Strategy

Announcing general availability of ExpressRoute for Office 365

Today we’re pleased to announce that Azure ExpressRoute for Office 365 is now generally available from these network operators:

  • British Telecom
  • Equinix
  • Tata Communications
  • TeleCity Group
  • Verizon

You can read about how Microsoft is using ExpressRoute for Office 365 in the Microsoft IT whitepaper, “Optimizing network performance for Microsoft Office 365.”

Connecting your network to Office 365 using Azure ExpressRoute

Depending on your network configuration, here’s how you can work with network operators offering ExpressRoute for Office 365 to establish a connection between your network and Office 365:

  • If your organization already uses Azure ExpressRoute, your network operator can simply turn on the connectivity for you. Since use of Office 365 generates additional network traffic, you should discuss the requirements for additional bandwidth with your network operator.
  • Organizations using IP VPN technology for a WAN provided by a network operator can ask the network operator to add Office 365 as a node on your WAN. Once Office 365 connectivity is added, Office 365 services appear as if they are on your WAN—like an offsite datacenter.
  • ExpressRoute can also support large or point-to-point network connections. If you have a large network, you may already have a network connection in a co-location facility where Azure ExpressRoute is available. You should work with your network provider to identify the best way to connect to Azure ExpressRoute.
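For organisations provisioning the Azure side of the connection themselves, the circuit request can be sketched with the Azure CLI. This is a sketch only: the resource group, circuit name and location values below are hypothetical, and the circuit only becomes usable once your chosen network operator provisions their side. Office 365 connectivity additionally requires the ExpressRoute Premium SKU.

```shell
# Request an ExpressRoute circuit (sketch; hypothetical names and values).
# Bandwidth is in Mbps - size it for the additional Office 365 traffic.
# The Premium SKU tier is needed for Office 365 routes, and the operator
# ("Equinix" here, one of the launch partners) must still provision
# their side of the circuit before traffic flows.
az network express-route create \
    --resource-group contoso-network-rg \
    --name contoso-er-circuit \
    --peering-location "London" \
    --provider "Equinix" \
    --bandwidth 200 \
    --sku-tier Premium \
    --sku-family MeteredData
```

After the operator completes provisioning, peering configuration and any additional bandwidth requirements are agreed with them, as described above.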


Frequently asked questions

Q. Where in the world is ExpressRoute for Office 365 available?

A. Your users can connect from anywhere in the world that your network operator provides access. Each network operator connects to the Microsoft network at specific locations. They can provide networking from the user location to the Microsoft network connection. You should discuss the options with your network operator. The locations where they will connect to Microsoft’s network are listed here.

Q. How do organizations purchase ExpressRoute for Office 365?

A. Organizations interested in purchasing ExpressRoute for Office 365 should have an Azure subscription and should discuss details of the connectivity with an Azure ExpressRoute partner.

Q. Are there any Office 365 services that Azure ExpressRoute cannot provide a connection to?

A. Today connectivity is available for Exchange Online, SharePoint Online, OneDrive for Business, Skype for Business Online, Azure Active Directory, Office 365 Video, Power BI, Delve and Project Online. Services that ExpressRoute does not provide connectivity to include download of Office 365 ProPlus installation files, Yammer, Domain Name Service and Content Delivery Network servers.

Q. Is QoS supported on Azure ExpressRoute?

A. Yes. QoS is supported for Skype for Business Online over Azure ExpressRoute for Office 365.

Q. Does Microsoft provide tools to test for network performance issues?

A. Yes. We have the Office Client Performance Analyzer (OCPA), which was recently updated to add a number of new performance test metrics. Azure ExpressRoute for Office 365 may be a solution to network performance problems experienced by users. OCPA can be downloaded from the Office 365 admin console here.

For customers with Premier support contracts, Microsoft has a service offering for Office 365 Network Performance Assessment. Please contact your Technical Account Manager for details.


How SAML is used for Single Sign-On (SSO)

Home Page, O365, Strategy

Within SAML, there are profiles that define how assertions, protocols and bindings are combined to satisfy a particular use case. Think of a SAML profile as a template: each profile uses a different combination of bindings, protocols and assertions. One of the most widely used SAML profiles is the Web Browser SSO Profile.

The SAML Web SSO Profile provides the ability for users to access multiple applications with a single set of credentials entered once. This is the foundation of federation and also of single sign-on (SSO). Using SAML, users can seamlessly access multiple applications, allowing them to conduct business faster and more efficiently.
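To make “assertion” concrete, the sketch below builds a minimal SAML 2.0-style assertion using only the Python standard library. The issuer, subject and audience values are hypothetical, and a real assertion would also carry an ID scheme, timestamps, attribute statements and an XML signature; this shows only the core identity claims an identity provider vouches for.

```python
import xml.etree.ElementTree as ET

# SAML 2.0 assertion namespace (from the OASIS SAML specification)
NS = "urn:oasis:names:tc:SAML:2.0:assertion"


def build_assertion(subject: str, issuer: str, audience: str) -> str:
    """Build a minimal, unsigned SAML-style assertion (illustration only)."""
    ET.register_namespace("saml", NS)
    assertion = ET.Element(f"{{{NS}}}Assertion",
                           {"Version": "2.0", "ID": "_demo-assertion-1"})
    # Who issued the assertion (the identity provider)
    ET.SubElement(assertion, f"{{{NS}}}Issuer").text = issuer
    # Who the assertion is about (the authenticated user)
    subj = ET.SubElement(assertion, f"{{{NS}}}Subject")
    ET.SubElement(subj, f"{{{NS}}}NameID").text = subject
    # Which service provider is allowed to consume it
    cond = ET.SubElement(assertion, f"{{{NS}}}Conditions")
    aud = ET.SubElement(cond, f"{{{NS}}}AudienceRestriction")
    ET.SubElement(aud, f"{{{NS}}}Audience").text = audience
    return ET.tostring(assertion, encoding="unicode")


doc = build_assertion("alice@example.com",
                      "https://idp.example.com",
                      "https://bank.example.com")
print(doc)
```

The audience restriction is what stops a token minted for one service provider being replayed at another, which matters once a single login opens many applications.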

You may not have realized it, but you use SAML SSO every day, whether you are logging into your bank online, using a mobile application, or signing into almost any website and accessing the information therein. For the purposes of explaining how SSO works, let’s use online banking as our use case. When a bank customer logs in to their account via the bank’s website, they may need to access a variety of applications, from their checking and savings accounts to their credit card balance. Each of their account types (savings, checking, credit, brokerage, business) is often provided by a different back-end application. These applications need to communicate with each other using a common authentication scheme to provide a seamless user experience, so that one login gives access to all parts of the online banking web portal. SAML provides the means to accomplish this.


Let’s look more closely at the sequence of steps used to generate a SAML token and then use it to gain access to an application or resource. The steps below show the basic sequence for SSO using SAML.


  1. The user authenticates to the identity provider using single-factor or multi-factor authentication.
  2. The identity provider issues a SAML token to the user containing assertions about the user’s identity. On mobile devices and in web browsers, the SAML token is often issued as base64-encoded content embedded within the HTML response.
  3. The user’s browser is redirected from the identity provider to the location of the service provider, and then issues a request to the service provider with the SAML token embedded. The service provider inspects the SAML token and its contents to determine validity based on its trust relationship with the identity provider, and then grants access to the various online banking applications based on the assertion statements included in the token.
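The steps above can be sketched as a toy model in Python. All names are hypothetical, JSON stands in for the SAML XML, and real deployments validate an XML signature rather than the simple field checks shown here; the sketch is only meant to show the shape of the exchange: the identity provider base64-encodes a token for embedding in the HTML response, and the service provider decodes it and checks the issuer it trusts plus an expiry time.

```python
import base64
import json
import time

# The one identity provider this service provider trusts (hypothetical).
TRUSTED_ISSUER = "https://idp.example.com"


def issue_token(subject: str, lifetime_s: int = 300) -> str:
    """Step 2: the IdP issues a token asserting the user's identity,
    base64-encoded so it can be embedded in the HTML response."""
    token = {"issuer": TRUSTED_ISSUER,
             "subject": subject,
             "expires": time.time() + lifetime_s}
    return base64.b64encode(json.dumps(token).encode()).decode()


def service_provider_accepts(encoded: str) -> bool:
    """Step 3: the SP inspects the token's contents and decides validity
    from its trust relationship with the IdP and the token's expiry."""
    token = json.loads(base64.b64decode(encoded))
    return token["issuer"] == TRUSTED_ISSUER and token["expires"] > time.time()


tok = issue_token("alice@example.com")
print(service_provider_accepts(tok))  # True for a fresh token from the trusted IdP
```

A token from any other issuer, or one past its expiry, is rejected, which is the property that lets many back-end applications share one login safely.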

SAML SSO provides a seamless experience for the user to access multiple applications without the user or client technology requiring any changes to support the SAML exchange.

To learn more about SAML SSO, download our latest white paper: How to Implement Enterprise SAML SSO

Date posted: October 10, 2014