Yolanda Smit, Strategic BI Manager at PBT Group, speaks about the difference between big data and normal data, and highlights the ever-important question companies should be asking: “Do you have a problem that big data can solve?”

Business Intelligence, or BI, is increasingly a must-have technology for progressive businesses. But how does it stack up with the instincts of business leaders who should be taking the BI reins, and why can’t you expect results tomorrow?

There’s a lot of jargon encouraging businesses to run on intelligence. Marketing hype often talks of the ‘fact-based’ or ‘real-time’ business. But it’s disingenuous, since all good companies already operate on intelligence. The arrival of technological products that handle that intelligence is not solving a particular problem so much as enhancing a pre-existing role.

Yet this can lead to the suspicion that BI aims to replace the instincts of business leaders, usurping their guidance. It has bred hostility towards BI and a misreading of the value BI solutions can bring to an organisation, says Riona Naidu, head of Consulting Services & Marketing at Knowledge Factory.

“BI shouldn’t be an uncomfortable thing. It shouldn’t be a big move. It should be something that works alongside what the business is actually doing, just making it a bit more efficient and to the point.”

BI should complement the culture of an organisation, not upset it. CEOs aren’t exclusively fact-based and technology can’t replace their intuition. If done right, BI serves to inform a leader’s decisions or creates an opportunity for them to test their hypothesis. Yet a successful approach to BI requires leaders not to be blind to the promises of modern intelligence gathering. Real progress requires vision. Says Juan Thomas, Solutions director of PBT Group: “Companies that are really getting value from their investments are the ones that keep investing in newer and better technologies. We have the capability to do it, but do we have buy-in from a programme manager or the C-level?”

Executive buy-in

The question of business buy-in quickly surfaces. Good BI falls squarely on the culture pushed by the CEO, says Bill Hoggarth, an independent consultant with over 30 years of BI experience. “If you look at most of the changes in the history of BI and analytics, they’ve been driven by individuals. Sam Walton was the first to base his business on data- and fact-led decisions. When he only had two stores in Arkansas, he had already invested in a nascent mainframe. He made information part of his business strategy and it made Walmart what it is today.”

Walmart is often cited as a wholly technology-driven company. It has long embraced technological developments in supply chain management to keep its pricing both competitive and profitable. Most recently, the company merged its various IT divisions into Walmart Technology, a single monolithic entity that surveys all of the retail giant’s technology plans from close to the business’ pulsing heart. It is also a trailblazer of BI adoption.

You have to understand the business context and then work back to what data is required to deliver the intelligence that will enable you to make the good decisions.
– Steven Ing, BSG

Any company can achieve such an edge, but it doesn’t happen overnight, despite the promises of some BI solutions. More on that later, but the first and fundamental step to any BI journey starts, according to Steven Ing, associate consultant at BSG, from the business value proposition: “What decisions do you need to make to enable that proposition? You have to understand the business context and then work back to what data is required to deliver the intelligence that will enable you to make the good decisions. The reason why BI has failed is that a lot of these projects are started the other way around. You’re not really understanding what decisions you’re trying to affect and, therefore, what intelligence is required for that.”

Expensive mistakes

But can the onus really be put on business to appreciate the cat’s cradle of technicality present under BI’s hood?

“There’s always this clash between business and IT,” says Tony Bell, director at Decision Inc. “What’s happened is that the business people have decided they have performance issues that require immediate solutions in reporting and analysis. So their first capability is to make better decisions. They don’t care where the data comes from. They just want results to see and improve. They want that value. If you can show the value and improvement in business, business is entirely happy and that expands from one department to another.”

But not everyone agrees with this view. The best BI successes, says PBT Group’s Thomas, tend to come from leaders who take an interest in the technology side of things: “Most of the success stories I know of are where your executives are really interested in that technical landscape. I think that’s where, once you understand why we do what we do, there is immediate buy-in. We shouldn’t underestimate executives’ understanding. If you’re throwing R30 million at a BI shop for five years, you will start asking questions.”

Gary Allemann, MD at Master Data Management, combines the two views, saying it’s important for leaders to have their minds on both the business outcomes and the technology behind it. Otherwise the eagerness for BI can lead to some expensive mistakes.

So the idea that IT is the central custodian and guardian of all data in an organisation…I don’t buy that.
– Bill Hoggarth, BI consultant

“One client we’re working with had American consultants come in and build a new segmentation model for them, but it was built upon data sets that they didn’t have in their business,” says Allemann. “So, if you don’t have the data, the model is useless. But at the same time, the concept of saying, ‘I’m building these reports and models because I’m trying to achieve better segmentation to achieve a business goal’ is absolutely correct.”

Buck-passing

Executive buy-in and the vision of a fact-driven company are important steps towards BI nirvana, but the process still faces a significant barrier: who owns it?

“There are no barriers preventing companies from doing anything with BI,” says Ing. “Only people prevent this from happening. The C-level have heard of these concepts, like being fact-based, doing analytics, etc. But they still throw the ball over the wall to IT. Come back and show us something. They’re not taking ownership and changing that culture in the organisation.”

But passing the buck to IT is almost reflexive. BI relies on data, and data represents all the information in an enterprise. The management of that information quickly becomes an issue of governance under the King codes on corporate governance. Since governance is often managed through technology solutions, says Chris O’Connell, MD of BITanium Consulting, companies habitually kick the BI ball to IT.

“King puts (information governance) responsibility firmly on the company’s plate. My feeling is IT is the proxy responsible for that key responsibility,” he says, leading them to become automatic heirs to a technology-based regime such as BI.

The problem is that if you don’t understand the data and you don’t understand the model, how do you know if that model is correct for that data?
– Matthew Cook, Datacentrix

That, though, soon devolves into buck-passing, which is problematic for most technology projects, but an outright death-knell for BI. Allemann appreciates the business habit of trying to be hands-off, but in a BI context, cautions against it: “One of the challenges we have when we throw that ball over the wall to IT is that it’s very easy to turn to IT and say, ‘You’re not getting us the solution we want’. But business needs to engage until they get the answers they want. We shouldn’t be picking technology until we understand the problem.”

Leaving most of that process to IT is dangerous, says Hoggarth. IT rarely has access to most of the data that will feed a BI solution, something that even a thorough interview process can’t overcome.

“Most customer data in SA today does not reside in the IT realm,” he says. “It sits on a Salesforce cloud or Microsoft Azure cloud somewhere. IT can’t manage that. It has no say over that. They often don’t even know their marketing or sales teams have put data in the cloud. So the idea that IT is the central custodian and guardian of all data in an organisation…I don’t buy that.”

Then what role can IT play? Says O’Connell: “I think its role becomes putting the guard rails in place, to make sure that business doesn’t hurt itself.”

Matthew Cook, Business Development manager at Datacentrix, agrees, noting that often business moves faster than IT. Even though the responsibility to find a solution is passed on to the technologists, users soon grab at the reins again.

“The process takes time, and in the meantime, business gets impatient and buys something, because it needs an answer now,” says Cook. “Should IT provide the guardrails? Absolutely: how do we (service providers) support IT in supporting business, but at the pace that business wants it done at?”

Allemann is not convinced that providing a sandbox for business should fall to IT, at least not in terms of BI: “Business needs to be defining what those boundaries are. We trust our financial data, but we don’t give it to IT and say, ‘It was your responsibility, so you sign off on these reports’. It’s signed off by the accountants, the auditors. So when we’re looking at marketing data, or sales data, or inventory, it’s not IT’s problem to make those reports accurate. It becomes their problem, because they have a role to play. But business needs to engage right from the start.”

Quick wins vs long-term strategy

A central theme starts to emerge from the conversation: embracing BI is neither simple nor iterative. It requires a lot of upfront hand-wringing and decisions that need to be carried through by all, as well as an understanding by the company leadership of its various moving parts. A BI solution can’t simply be delegated, then judged by the results.

“The problem is that if you don’t understand the data and you don’t understand the model, how do you know if that model is correct for that data?” asks Cook. “That’s the life cycle you need to go through: to get a better understanding of your data from a context perspective. Get an understanding of what it is you’re hoping to achieve from a measurement point of view and then marry the two together.”

Does this mean there are no quick wins in a BI environment? That may well be the case. If anything, BI needs to be preceded by an overarching plan: a framework that stops BI’s organic growth from overwhelming everything. Just consider how many reports a company generates and how often the nature of those reports changes as people shift positions. The result is information overload and the death of clarity. Losing control over BI is always a risk, hence the need for a steady hand from the start.

Yet Naidu says there is still a role for some quick wins in BI: “It’s part of the culture change. You walk into any boardroom and you have the pro-data crowd that wants the BI systems right away. Then you have the pro gut-feel crowd, and then some people who are in-between the two. Quick wins are one way to unify everybody and prove the case as they go on.”

But do not confuse ‘quick win’ with ‘quick fix’, cautions Thomas: “Some companies are stingy, hoping the small self-service BI solution they use will solve their challenges and give an instant advantage. That’s not the case. You have to crawl first, then start walking. The guys who are running are the ones seeing the real advantage.”

Hoggarth, though, feels optimistic that the modern innovation around BI is making it more feasible to get insight faster, because it’s no longer restricted to internal data.

“We have the tools, but often the raw ingredients for those wins don’t exist within the company itself,” he says. “They are out there somewhere. They want to know what competitors are doing and what customers think. BI hasn’t been able to do that. It can now and that’s the inflection point. This is a very exciting time for BI.”

Credit: IT Web

Before adopting a cloud business intelligence solution, a company must delineate its approach to the cloud BI concept.

By , Solution Architect Manager at PBT Group.
Johannesburg, 2 Jun 2016


Business intelligence (BI) in the cloud is a hot topic within many companies. As with all new concepts, there is the inevitable level of confusion and uncertainty regarding how to proceed. To demystify the topic, I believe things need to go back to the basics.

For many companies wanting to understand BI and their cloud strategy, the first area of focus would be on actually defining a viable ‘cloud business intelligence strategy’. Only once the strategy has been defined does it make sense to evaluate the product(s) and associated vendor(s) that can support this strategy – and from here, to define the critical success factors for successful cloud BI adoption.

Only when all of this is accomplished is the company ready to start planning the physical implementation. This is a lot to digest in one Industry Insight, so let me start with what to consider when defining a successful cloud BI strategy.

Never-ending story

What makes cloud BI a complex and confusing topic? Well, to my mind, it is the long list of cloud BI solution scenarios available to clients. Picture this: a company decides to focus purely on the data; this means it could look at an operational data store or data warehouse, fully or partially, in the cloud. Alternatively, it could consider hosting the entire visualisation stack, or a partial subset thereof, in the cloud. For example, any combination of operational, management and strategic level reporting and self-service analysis can be hosted in the cloud. Additionally, many product vendors are now supplying cloud-based advanced analytics platforms – confusing, right?

To further compound the issue, certain vendors can also accommodate the entire set of solution scenarios in their cloud technology stack, while others only address a subset of these. Added to this is the fact that certain vendors focus exclusively on supplying infrastructure in the cloud, allowing any combination of software products to be installed.

As a result, deciding on which approach to take can be an overwhelming task and makes defining the strategy extremely challenging.

I believe navigating the myriad available options successfully comes back to the basics – which means gaining a clear understanding of the actual business case.

Starting point

Reducing capital investment, by reducing onsite hardware costs, is typically the first business case considered for cloud BI.

This is closely followed by increasing the speed and decreasing the cost of scalability. Cloud BI environments are inherently scalable in terms of storage, software licensing, processing power, etc, and that scalability is on-demand. This means the business can scale up in peak times, during major marketing campaigns, for example, and then scale back down, when required, paying only for what was used.
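To make the pay-for-what-you-use point concrete, here is a minimal Python sketch, with entirely invented node counts and rates rather than any vendor's actual pricing, comparing month-long peak provisioning against elastic bursting for a campaign:

```python
HOURS_PER_MONTH = 730  # average hours in a month

def fixed_cost(peak_nodes, rate):
    """On-premises style: provision for peak load and pay for it all month."""
    return peak_nodes * rate * HOURS_PER_MONTH

def elastic_cost(base_nodes, peak_nodes, peak_hours, rate):
    """Cloud style: pay for the baseline, plus extra nodes only during the peak."""
    base = base_nodes * rate * HOURS_PER_MONTH
    burst = (peak_nodes - base_nodes) * rate * peak_hours
    return base + burst

# A quiet baseline of 2 nodes, spiking to 10 nodes for a two-week
# campaign (336 hours), at an invented R15 per node-hour.
print(f"fixed:   R{fixed_cost(10, 15.0):,.0f}")            # R109,500
print(f"elastic: R{elastic_cost(2, 10, 336, 15.0):,.0f}")  # R62,220
```

Under these made-up numbers, the elastic model roughly halves the bill; the real saving depends entirely on how spiky the workload actually is.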

Reducing maintenance costs (via reduced on-premises energy consumption, software maintenance and upgrades, and standby and support) is also a valid IT-driven business case. In evaluating this business case, however, companies need to consider the human impact it might have in making IT maintenance staff redundant. Care, consideration and long-term skills transfer need to be examined to minimise redundancies, in line with this business case.

The positive impact that cloud BI can have on business continuity is growing rapidly in the South African context.

A business case that sometimes does not get enough exposure is improving the IT adoption rate of new features. Traditional IT departments are, at best, on the latest version minus one. Many clients, however, are two, sometimes even three, versions behind. Being cloud-enabled ensures companies have access to the latest features, allowing them to respond to emerging trends and take advantage of new features far earlier than potential competitors that are still using on-premises solutions.

Reducing resource costs and increasing productivity is a major factor to consider from the business point of view. Through improved collaboration, the business community can be more actively involved in creating and sharing content, freeing up traditional IT resources and improving the end-user experience, all while speeding up information delivery.

The positive impact that cloud BI can have on business continuity is growing rapidly in the South African context. Business continuity is improved as there is less dependence on local resources, so BI services can still be available during local infrastructure outages.

These days, it is sometimes difficult to tell where big data begins and cloud BI ends, given the two are so closely intertwined. Cloud BI enables big data integration, as it brings organisations far closer to online big data services. Many cloud BI products come with built-in interoperability, with various cloud-based big data services. This means large volumes of data can ‘stay’ in the cloud, without having to clog the local bandwidth and processing capability.
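As an illustration of the ‘stay in the cloud’ idea, the sketch below uses the DuckDB query engine to push an aggregation to Parquet files sitting in object storage, so only the summarised rows travel back over the network. The bucket path and column names are hypothetical, and credential/region configuration is omitted:

```python
import duckdb  # pip install duckdb

con = duckdb.connect()
con.execute("INSTALL httpfs")  # one-off: extension for reading object storage
con.execute("LOAD httpfs")     # (S3 credentials/region setup omitted here)

# Hypothetical bucket and schema: the heavy scan runs where the data lives,
# and only a handful of aggregated rows crosses the network.
monthly = con.sql("""
    SELECT date_trunc('month', event_time) AS month,
           count(*) AS events
    FROM 's3://example-bucket/clickstream/*.parquet'
    GROUP BY 1
    ORDER BY 1
""").df()
print(monthly)
```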

Having understood the business cases that are relevant to the company, it then becomes important to understand the use cases where cloud BI can be employed. These are typically focused on collaboration; for example, regionally and/or internationally distributed sales teams, groups of departmental power users, etc. Another area of collaboration, which is seldom considered, is sharing BI with third parties without having to expose the internal network.

The benefits of cloud BI can be virtually limitless, but so too are the options and permutations of a cloud BI implementation. It is therefore critical to have a clear idea of what is important to the business, now and in the medium term, as this clarity is the essential first step in the cloud BI journey.

Source: IT Web


Although accountants have used physical spreadsheets for hundreds of years, computerised self-service tools have been on the rise since VisiCalc, the interactive ‘visible calculator’ invented by Daniel Bricklin and Bob Frankston in the late 70s.
VisiCalc laid the foundation for Lotus 1-2-3, which established itself as a data presentation package as well as a complex calculation tool that integrated charting, plotting and database capabilities. Lotus was also the first spreadsheet vendor to introduce named cells, cell ranges and spreadsheet macros in the early 80s. Microsoft Excel was the next milestone in computerised self-service tools, arriving in the mid-80s. The need for self-service analytics is, in fact, nothing new.
Most aspects of people’s lives are inundated with self-service alternatives. Companies are increasingly offering ‘do it yourself’ alternatives. Examples include airline check-in, automated teller machines for banking, public vending machines for a quick snack, as well as kiosks for settling shopping mall parking fees – all of which have enjoyed high adoption around the globe. Their success has been attributed to the ease of use of the self-service terminals and portals. Many companies have seen greater savings in their support costs, as well as improved service delivery, thanks to these self-service alternatives.


Defining the self
Gartner’s IT glossary defines self-service analytics as a form of business intelligence (BI) in which line-of-business professionals are enabled and encouraged to perform queries and generate reports on their own, with nominal IT support.
It is often characterised by simple-to-use BI tools with basic analytic capabilities and an underlying data model that has been simplified or scaled down for ease of understanding and straightforward data access. This promotes the notion of a shift from IT-led enterprise reporting to business-led self-service analytics in which business users are encouraged to “feed themselves”. The definition also supports the approach in which a semantic layer is prebuilt and a BI tool that is easy to use is presented to access the data.
Ideally, training should be provided to help users understand what data is available and how that information can be exploited to make data-driven decisions to solve business problems. However, once the skilled IT professionals set up the data warehouse/marts that support the business needs, users should be able to query the data and create personalised reports with very little effort, as in the sketch below. Historically, slow adoption of a self-service culture has mostly been attributed to computer tools that required specialised knowledge to operate.
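As a toy illustration of that simplified, prebuilt layer, here is a minimal Python sketch (all table, view and column names are invented for the example) in which the join logic is hidden inside one business-friendly view, leaving the self-service user a single short query:

```python
import sqlite3  # standard library, so the sketch runs as-is

con = sqlite3.connect(":memory:")
con.executescript("""
    -- The source tables IT maintains (names invented for the example).
    CREATE TABLE customers (cust_id INT, region TEXT);
    CREATE TABLE orders (order_id INT, cust_id INT, amount REAL, order_date TEXT);
    INSERT INTO customers VALUES (1, 'Gauteng'), (2, 'Western Cape');
    INSERT INTO orders VALUES
        (10, 1, 500.0, '2016-05-01'),
        (11, 1, 250.0, '2016-05-03'),
        (12, 2, 900.0, '2016-05-02');

    -- The simplified semantic layer: one pre-joined, business-friendly view.
    CREATE VIEW sales AS
        SELECT o.order_date AS day, c.region, o.amount
        FROM orders o JOIN customers c ON o.cust_id = c.cust_id;
""")

-- = None  # (comment marker above is SQL's; the Python below is the user's view)
# A self-service user only ever sees 'sales' and asks questions in its terms.
for region, total in con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(region, total)
```

The same idea scales up to the semantic layers of commercial BI tools: IT curates the view once, and the business queries it freely.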


Experts only
Until recently, self-service BI tools were mostly for specialists – they were hard to operate and required a knowledge level similar to that of data scientists. Front-line business managers who desired BI-style insights had to send query requests to BI specialists working in the BI department, and then endure unbearable turnaround times to get reports that were difficult to change or influence. All this is changing, due to advances in database and query technology, as well as redesigned front-end tools that make it easier for any user to interact with the data.


Self-service BI attempts to generate new insights through shared responsibilities.


The concept behind self-service is that front-line business executives and managers should be able to get up and running quickly with these tools, without having a data analysis background and without requiring a BI specialist as a middleman. Generally, these self-service BI tools should be as easy to use as the typical spreadsheet, enabling a user to query data, analyse the answers, and create some kind of visual representation of the data that is suitable for presentation or sharing with other non-technical personnel.
Self-service BI in no way overthrows traditional database management or data scientists. The insights provided by these professionals are complex in nature and remain invaluable. Instead, self-service BI attempts to generate new insights through shared responsibilities, realising new value from hard-won data through more informal, ad hoc analysis.
The business need for self-service tools has always been around and has not changed much over time. What has changed, and continues to change, is the technology used, the data available, and the culture/expertise of information use. New technology possibilities are nurturing the self-service culture in recent times.
The increasing adoption is confirmed by the exponential growth in annual revenue for the three ‘leaders’ in Gartner’s 2014 Magic Quadrant for Business Intelligence and Analytics Platforms (Tableau, MicroStrategy and Qlik).
I concur with Clarity Solution Group CEO Mike Lamble’s opinion: “In the self-service paradigm, ‘power users’ triumph over portal users. Tools are analytic-centric rather than reporting-centric. Business discovery supersedes information delivery. Semantic layer-free data exploration and rapid prototyping are where the action is.”



A few years back, big data landed in the world of analytics with a rather unflattering and unstructured thump, very nicely hash-tagged with phrases like ‘the next big thing’, ‘powerful’, ‘unprecedented insight’, etc. The sheer volume, velocity and variety had data analysts frothing at the mouth.
Fast-forward a few years and it has become increasingly apparent that the volume, variety and velocity are increasing exponentially, with absolutely no signs of slowing down or tapering off. If anything, it is going to get worse. The more IoT-connected ‘things’ that are invented or added, the larger, faster and more disparate and ‘uncontextualised’ big data is going to get. This is a very large and fast-moving problem.
Already, there is way too much information with not enough talent to manage it or time to sift through it. The longer it stays untouched and unused, the more context is lost and the more data is lost to data decay.


For one moment, consider the following:
According to IBM Watson¹, unstructured data accounts for 80% of all data generated today. The majority of that data is noisy, ‘uncontextualised’ and in formats that cannot be read by traditional systems, and this noisy and dirty data is expected to grow to over 93% of the total by 2020¹.
Fuel – oil platforms can have more than 80 000¹ sensors in place. A single platform can produce more than 15 petabytes of data in its lifetime¹. Tools like Watson could help companies prevent drilling in the wrong place and help with flow optimisation (the volume and rate at which oil is pumped).
Healthcare – in your lifetime, you will generate 1 million GB of health-related data¹. That is the equivalent of 300 million books. Imagine what a computer that can collate and predict quickly and accurately could do with that much information.
Transportation – by 2020, 75% of the world’s cars will be connected¹. They will collectively produce approximately 350MB of data per second to be assessed and acted on. Self-driving and self-learning cars will soon be the norm. By their very nature, they will need to be able to learn and apply reasoning. Governments are not going to re-grid their entire road infrastructure.
Added to these scaling volumes is a huge shortfall of talented analysts and data scientists. Those that are around simply can’t keep up with the ever-growing volumes of data. This shortfall presents a massive problem for business, because even the most advanced data platforms are useless without experienced professionals to operate and manage them.


Answers please
So, then, what is the solution? More training and better academic programmes? Possibly, but the exponential nature of big data means users are always going to be playing catch-up. So another solution needs to be found: a scalable and fast solution that can extract insight at close to the speed at which big data is collated and collected; a solution that keeps as much of the original context of the volume as possible. Say hello to Watson² and Coseer³.
The future of big data is finding, scripting and training computers to do the work for people. Computers that think the way humans think, that use context to flesh out meaning, and that can think outside rigid decision-tree logic.


What will result is limitless possibility.


Computers that can cognitively make decisions and learn from each and every interaction, at speeds humans can only dream of.


What, exactly, is cognitive computing?
Simplified, cognitive computing is the creation of self-learning systems that use data mining, pattern recognition and natural language processing (NLP) to mirror the way a human brain works, derives, contextualises and applies logic. The purpose of cognitive computing is to create computing systems that can solve complicated problems without constant human oversight and, in the process, far surpass the speed at which humans can do it.
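To ground the self-learning idea, here is a minimal Python sketch of an incrementally trained text classifier using scikit-learn. It is a toy, nowhere near a system like Watson, and the example phrases and labels are invented, but it shows a model that updates itself with every labelled interaction rather than being rebuilt from scratch:

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectoriser = HashingVectorizer(n_features=2**18)  # raw text -> numeric features
model = SGDClassifier(random_state=0)             # linear model with partial_fit

def learn(texts, labels, first=False):
    """Nudge the model with one batch of labelled interactions (no full retrain)."""
    X = vectoriser.transform(texts)
    if first:
        model.partial_fit(X, labels, classes=[0, 1])  # 0 = neutral, 1 = complaint
    else:
        model.partial_fit(X, labels)

# Each batch of interactions updates what the system 'knows'.
learn(["my claim is still unpaid", "thanks, great service"], [1, 0], first=True)
learn(["no response to my claim for three weeks", "query resolved quickly"], [1, 0])

# Score a new, unseen interaction; with so few examples the output is only
# indicative, but the shared vocabulary makes a complaint flag likely.
print(model.predict(vectoriser.transform(["still no response on my claim"])))
```

Production systems differ enormously from this toy, but the partial_fit pattern is the essence of ‘learning from each interaction’: the model absorbs new evidence without being rebuilt.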
What will result is limitless possibility. This is perhaps as close to true agile computing as it will ever be.
Cognitive computing is all about changing the world and entire industries, being able to see things that were lost in the volume, and finding insight that people have not been able to grasp before.
Today, 2.5 quintillion bytes of data¹ are created every day – that is 2 500 000 000 000 000 000 bytes. Every person on this planet will add around 1.7MB of data to that statistic, every second of today.
Human intelligence simply cannot scale in the same way that data is scaling, and cognitive computing enables people to deal with these massive amounts of data. Don’t get me wrong – cognitive computing can never replicate what the human brain does. It is simply a system that can handle massive amounts of unstructured data, quickly and accurately.
The insight that could be provided is immeasurable.


Source: http://www.itweb.co.za



In business, having your foot in your mouth can easily translate to a foot out the door. So when the ICT world makes a noise, the aim is to attract new business.
ICT companies don’t talk about themselves, especially those that sit in the food chain between vendors and clients. Yet, these companies help make up a R13.9 billion local industry, excluding telecommunications. So Brainstorm decided to turn the tables, so to speak, and ask its roundtable attendees: what sits behind the sales pitch?
The people of ICT are as regular and congenial as anyone else, but their brands often demand a bit of swagger and bravado. ICT companies court big fish with big contracts – cloud may have broadened the market to smaller customers, but it remains a high-stakes game.
So when the opening question is an opportunity for attendees to buff their brand, a steady march of jargon and catchphrases appears. One person even manages to capture the entire mantra in a single comment: “You must innovate and iterate, but one must also be cautious of cutting-edge delivery. You want to deliver solid solutions, but also be at the forefront of technology. People you work for want to know you deliver top-end solutions and give them what they need.”
This is what practically every ICT company would and does say. But similar comments don’t even make a complete round before the conversation begins turning inwards, speculating on the direction of a very competitive industry.
“With quality comes a price and we’re at a time where companies are very cost-sensitive,” says Lance Fanaroff, joint CEO of Integr8, wondering if customers are going to forego quality in tough times. “It’s about whether that additional quality warrants them spending the additional money. As South Africa comes under the squeeze, there will be more of a focus on price than quality.”
Bruce Pitso, South African regional manager at Ruckus Wireless, agrees that there’s pressure from customers to reduce costs or be undercut, but sees it as a more long-term trend that emerges as business leaders take control of ICT: “You get organisations that will want quality at any (or great) cost. But you get other companies that say, ‘If it will work at a cheaper cost, we’ll go for it’. People who decide about technology often don’t know much about technology. So the best of breeds will come in and cater a project for a long-term investment. But because the supply chain has been instructed to go for a more cost-effective solution, it’s a challenge. And we are headed in that direction.”


Quantity race
At this point, it warrants a reminder that ‘ICT’ is a catch-all phrase that involves many different types of technology and implementation. Pitso’s comment gels with that of companies that count hardware and infrastructure as a large part of their business. Whether you look at the rise of Chinese manufacturers or fibre networks threatening incumbents, there is a quantity race in those markets that can override quality, something customers are taking advantage of (if the risk seems fine).

But Decision Inc.’s CEO Nicholas Bell notes that this is not the case in the more service-orientated ICT sector: “On the professional services side, it’s slightly different. Quality is no longer the differentiator – it’s the norm, the expectation. The market wants it for less, but it doesn’t want the quality drop-off. So with a large enterprise, you can’t increase rates or make great margins. They are squeezing us on rate, but the quality remains constant. These days, they have options and take advantage of that.”
But is there even a choice? A popular military maxim says that a good plan now is better than a perfect plan later. Yet in ICT, a good plan now often leads to big headaches later, says Professor Barry Dwolatzky, director and CEO at the Joburg Centre for Software Engineering. “There is a concept bandied around called technical debt. If you rush to market with something that is cheap now, it will cost you later. We need to get it out of our minds that you do something well or you do it cheaply. We have to find ways to do things well and cheaply and use those in the same sentence, not as flipsides of the same coin.”
Yet, companies can’t just expect cutthroat pricing if they’ve dug themselves into a hole, says Warren Olivier, regional manager, Southern Africa for Veeam Software: “Often companies didn’t invest in an entire solution, only a part of it. There hasn’t been that entire end-to-end focus.” The result is patchwork environments that are never brought ahead of the curve.


We’re at a time where companies are very cost-sensitive.
– Lance Fanaroff, Integr8


Nonetheless, this spells more opportunity for the ICT industry: “I’m not going in to force this down the customer’s throat,” says Olivier. “I’m going to say, you got a bit of this and that. I’m going to find ways to leverage more out of this. That’s where partnerships and alliances – co-opetition – are important.”
But delivering on those intentions and partnerships requires a key ingredient, one that is becoming ever scarcer in the country.


Falling behind
Earlier in the discussion, Bell had remarked: “(Enterprises) can’t afford for you to learn on their time. Everything is more urgent and must be out faster.”
He was talking about the demand for quality despite cost, but this also touches on a much bigger issue, one that requires much more buy-in than it is getting: skills.

Saying the ‘S’ word almost always draws the same response from IT professionals: a metaphorical roll of the eyes and a, ‘yeah, but what can you do?’ look. Yet skills are a serious problem. South Africa is not producing enough of them, particularly in the ICT field, and that shortfall is growing as 21st-century technologies start making an impact.
“There is a big gap between quality skills and the quantity of those skills,” says Dave Ives, head of Solutions at Karabina Solutions. “We have quality skills, but if I look at the new skills – machine learning, new languages and the stuff we’re encountering in the predictive space – I would say we have a skills shortage. If I look at the CRM and digital transformation space, taking a company to end-to-end transformation, I question if we have the depth and capability in this country.”
The big issue, he adds, is the lack of a large pipeline of people coming into the sector. Ives isn’t alone in this concern: an annual survey from Wits University’s Joburg Centre for Software Engineering last year found South African ICT skills to be lagging far behind Egypt, Kenya and Nigeria.

“We invest too little in skills as an industry and country,” says Dwolatzky, adding that this burden is too often laid at the feet of universities and government. Instead, the ICT industry needs to become much more involved and address its own culture. “If you look at the companies in India, for example, they recruit people from universities and put them on ten months of intensive training before putting them to work. They invest in their skills.”
Local companies throw newcomers into the deep end, then complain that the skills are rubbish: “We have to put the spotlight on what we as an industry are doing to produce the skills we need. There is plenty to complain about, but it’s all of our responsibility.”
“If I was approaching a vendor, I’d ask, ‘Do they actually contribute by growing skills?'” adds Kim Andersen, CTO at T-Systems South Africa. “In India, they decided IT matters to India’s economy. That has transformed the country. South Africa hasn’t made that decision yet.”


A pool of sharks
The lack of decent local skills has created a market defined by scarcity: high salaries, low retention rates and relentless headhunting.
“A lot of companies in SA don’t look at skills as an investment,” says Pitso. “They do it because they have to – they’ll take in interns as a tax kickback. But if someone is studying software programming, they aren’t stupid. So they will exploit this and get a better job.”

That lack of an investment mindset is, instead, in the words of one attendee, creating a pool of sharks. For example, when one of the country’s major banks needed skills for antiquated Cobol systems, they started an academy to train those skills. But other companies, instead of partnering with the initiative, snapped up graduates as quickly as they could.
This poaching takes place between large entities, such as financial institutions and government departments, bolstering their in-house talent pools. The trend is having a negative impact on the much smaller ICT market, which is often burdened with the expectation of training skills it knows will be lured away.
“I used to keep guys for three years,” says John Eigelaar, director and co-founder of Keystone Electronic Solutions. “Now most of them leave within six months to a year. I lose people before they are even at a useful stage!”
Adds Bell: “The problem is that the large companies such as the banks can offer salaries that don’t fall in line with the market we play in. They create this ceiling that makes it very hard for others to compete. So you have guys with a year’s experience getting double their salary and the industry loses them.”

But while there is agreement that the country needs more skills, not everyone sees the above as entirely negative. It may also define a feedback loop that helps the industry.
“If you train proper skills, wherever they go, they will generate more work,” says Armandè Kruger, regional sales director at PBT Group.


“Instead of getting a slice of the pie, let’s grow the pie. It’s not a perfect model, but there is another side to it.”
Still, Dwolatzky makes a clear call to arms: “Let’s all get around the table, let’s run a programme jointly. We all contribute and create a big pool of skills we can then all fish from.”


‘Sinful crimes’

ICT companies are not entirely victims. Skills are also scarcer because of the influx of new technologies and customer industries. Business ICT has been booming, creating a wave that the industry is not only happy to ride, but to amplify with its own hubris of ‘must have or die’ technologies. Raising this habit leads to a rare moment of admonishment from the attendees. Says Dwolatzky: “There is a concept called technological determinism. Does the technology drive change in business or does change in business drive the technology? We as technologists fall into the trap of technological determinism. We think we invent a new widget and that widget will change the world. In fact, it’s business that is changing things.”
The conversation starts around a question about the cloud: today, the mantra for salespeople is that cloud helps companies to innovate. But go back only a year or so, and the message was more about the cost benefits of using cloud infrastructure. This isn’t an atypical example of messages ICT sends to customers. As business takes more interest in technologies it doesn’t quite understand, the result is confusion – and ICT has been exploiting this.
Says Andersen: “The IT industry has long been committing sins: we’ve sold technology above and beyond the needs of business. We’ve pushed technology because we thought it was so great. But what is the business value?”
Yet, the real trap may be a matter of ego: solution providers cannot come across as incompetent or clueless. So they often have to toe the line for a technology that itself has yet to really define its usefulness. Fanaroff draws on the popular example of cloud: “Cloud means different things to different people. Meanwhile, it’s waiting for infrastructure to catch up and offer richer services to companies.”
The issue with new technology, he says, is that the use cases are not always there yet and it takes time for the market to find them. This tends to rely on other parts of the puzzle, such as connectivity and cost, to match expectations.
“Technology matures, so the message changes all the time.”
Technology is a moving target, which means those selling technology often have to run and talk at the same time. But those doing the buying shouldn’t think they are immune. Everyone in the ICT game should understand how highly fluid it is. Says Kruger: “Innovation comes from passion – once something becomes commercial, the innovation stops. And innovation these days happens at the speed of fibre, driven by a new generation that expects quick delivery. So innovation should be pushed from the bottom up. But if it only lives for a year, it lives for a year and then you move on.”


Source: http://www.itweb.co.za


Armandè Kruger, Regional Sales Director for PBT Group


There is no denying the sheer amount of data at our disposal. And thanks to the Internet of Things (IoT), data is increasing by the minute. However, those of us who work with data, or need to understand the influence this data has on businesses, clients or markets, are now faced with the task of sorting through this mass of information from various interconnected devices.

Those who rate IoT as a fad should consider the transformational impact it is likely to have on business, specifically on the data centre side. Take, for example, the amount of data that these interconnected devices are producing (and will continue to produce). Surely this alone offers organisations an opportunity to analyse the data and use this information to gain a much stronger competitive advantage in an increasingly saturated business market?

In fact, Gartner estimates that 6.4 billion connected ‘things’ will be in use worldwide this year, a 30% increase from 2015 alone – indicating that the IoT phenomenon is not going away. As a result, decisions around how best to process the associated data from these various devices should be an organisational priority in 2016.

Along with this, analysing the data is key – and a practical way businesses can go about doing this is by turning to advanced analytics. In fact, companies can apply advanced analytics to their entire data realm, to ensure they can leverage the information and integrate it within various areas of their business.

In other words, advanced analytics can be used to turn data coming from various business areas – including the IoT, sales, marketing, call centre feedback, transactional data and even social media – into something more powerful. Can you imagine the impact if a business could not only manage the data within, but also analyse it correctly, to make business decisions that are accurate, add value to customers, improve the bottom line and, in turn, increase profit margins? This is what will give a business a new and substantial competitive edge – and this is why IoT is no fad.

Advanced analytics can also generate predictive information for a business, giving an organisation insight into future outcomes. It is this that allows advanced analytics to help a business become more customer-focused, as the data being analysed can tell a business more about its customers’ needs, wants and expectations. Having this information at their fingertips means businesses can effectively satisfy a customer’s needs instantly.
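As a small illustration of what ‘predictive information’ looks like in practice, the sketch below trains a model on synthetic customer history (the features, numbers and churn rule are all invented for the example) and scores the probability of a future outcome such as churn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic history: [monthly_spend, support_calls, months_as_customer]
X = rng.normal(loc=[300.0, 2.0, 24.0], scale=[120.0, 2.0, 12.0], size=(1000, 3))
# Toy ground truth: low spenders who call support often tend to churn.
y = ((X[:, 1] > 3) & (X[:, 0] < 250)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# 'Insight into future outcomes': a churn probability for a current customer.
risky_customer = [[150.0, 6.0, 8.0]]  # low spend, many support calls, new
print(f"churn probability: {model.predict_proba(risky_customer)[0, 1]:.2f}")
print(f"holdout accuracy:  {model.score(X_test, y_test):.2f}")
```

A real model would be trained on actual behavioural history and validated far more carefully, but the output is the point: a forward-looking score per customer rather than a backward-looking report.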

So, as organisations start to make sense of 2016 and the challenges and opportunities the year is likely to bring, remember that IoT and data are starting to play a very important role in the success of companies globally (if utilised correctly).

Don’t discount the impact these so-called technology terms will have on your own business. Additionally, be careful of thinking that implementing some form of big data will solve your data problem – in isolation, it offers very little value. Analysing the data correctly, through aspects such as advanced analytics, is what will help determine how you can create or capitalise on opportunities in this digital marketplace in 2016.

Source: IT News Africa

 

By Yolanda Smit, strategic BI manager at PBT Group.

The fellowship of humanity is founded in stories – stories that evolved from cave drawings to Shakespearian writings to the modern-day stories on the cinema canvas. I know the debate of reading versus watching movies is probably as old as television itself and is still ongoing. However, the fact that cannot be debated is that people are infatuated with stories.

The visualisation of stories in movies has simply made stories more accessible to the portion of the population not inclined to find reading as enjoyable as others do. This does not mean books are redundant, as some people will never sacrifice the joy of immersing themselves in their own imaginations through the written word, but ultimately, the visualisation of stories continues to enrich a much wider part of the population.

In a similar way, one of the latest buzzwords in business intelligence (BI) – visualisation – is causing quite a significant uproar. Let’s set the record straight from the beginning: Visualisation is not the new BI. Visualisation is just an added medium through which one can publish the intelligence in the data so a larger portion of the company can benefit from discovering the story ‘hidden’ therein.

Picture it

Way back, in the Shakespearian BI era, data and intelligence were expressed in data tables. It never ceases to amaze me when I come across BI end-users who can glance at a data table with 25 columns and 72 rows showing regional weekly sales data for the last 18 months, and – within seconds – become excited by the trends they observe in the data. Yes, such data whizzes exist, and sometimes leave me reeling, convinced that Neo stepped out of ‘The Matrix’ through my computer screen. However, I tend to fall on the side of the masses, and for me to make sense of large data sets, the golden rule applies: “A picture is worth a thousand words (or numbers, in the case of BI).”

Over the past couple of years, managers have started to realise that, instead of being solely dependent on a small team of highly skilled quantitative analysts to analyse and interpret the data for the masses in lengthy book reports, the latest advances in the visualisation capabilities of tools like Power BI, QlikView and Tableau may unlock the story in the data to a wider audience, much faster.

However, don’t be fooled into thinking it is as simple as putting the tools and the data in the BI users’ hands, and – ‘hey presto’ – BI value is unlocked by the masses for the masses. It is not that easy. Returning to my analogy of books versus movies, consider how many people are involved in publishing a book versus bringing a movie to the silver screen. The book involves the writer, an editing team, the back-cover writer and the publisher. Judging from the credits on a movie, it can take a team of more than 100 people to effectively tell one story on the telly.

One must understand that unlocking the story in the data through visualisation takes very careful planning and design, to ensure the visualisation mechanism that best tells the story (bar graph, line graph, heat map, xy plot, etc) is used, as the sketch below illustrates. I’m yet to come across a company where all BI users just intuitively know how to match the right mechanism to the data to effectively answer the business question they have.
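To make the table-versus-picture point concrete, here is a minimal Python sketch with synthetic weekly sales (the region names, numbers and trend are invented): the same 72 rows that read as a wall of numbers become an obvious trend once the right mechanism, a line graph, is chosen:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
weeks = pd.date_range("2015-01-04", periods=72, freq="W")  # ~18 months of weeks

# Synthetic weekly sales: a rising trend in one region, flat in the other.
sales = pd.DataFrame({
    "Region A": 100 + 1.5 * np.arange(72) + rng.normal(0, 8, 72),
    "Region B": 120 + rng.normal(0, 8, 72),
}, index=weeks)

print(sales.round(1).head())  # the 'Neo' view: a wall of numbers

# A trend-over-time question calls for a line graph; a heat map or bar
# graph would bury the very story this data is trying to tell.
sales.plot(title="Weekly sales by region")
plt.ylabel("Sales")
plt.tight_layout()
plt.show()
```

Swap the question and the right mechanism changes: a regional comparison at a single point in time would want a bar graph, and a week-by-region intensity question a heat map.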

Unlocking value

Cue the role of the data visualisation architect (Google it, such a person exists), who is almost like the scriptwriter, location manager, set designer, casting director and director of photography all rolled into one. This person effectively combines the right data (script) with the right visualisation mechanism (set designer), and the right formatting and structure (director of photography), and then publishes it on the right platform (location manager) to the selected target audience (casting director), who will then effectively utilise the insight to the value of the organisation, resulting in a conclusion of the story. The data visualisation architect will know the science behind how visualisation is perceived and physiologically processed by the viewer, and use this specialist skill to design the optimal visualisation for each and every insight story embedded in the data.

As a side note, I must caution the reader not to confuse the data visualisation architect with the other buzzword, data scientist. That would be like demoting the director to become the set designer. Granted, data scientists have a very strong understanding and a keen, almost intrinsic, ability to design visualisations that tell a very clear story. However, a data scientist is usually the person who drives and steers the whole journey of discovering a new story, scripting, designing, casting, recording, and producing the movie. Data scientists are far more valuable in the exploratory analytics space where the business question is still being formulated, the hypothesis must still be defined, the suitable data sourced and analysed, and finally, the conclusions drawn and presented to decision-makers.

Just like the data scientist discovers untold stories hidden in the data, the data visualisation architect can enable business to unlock the value intrinsic in their existing data for decisions made on a daily/weekly/monthly basis. This is done by converting existing reams of data-table book reports into well-crafted visualisation views that tell the story succinctly in an aesthetically pleasing way – empowering the other 80% who cannot intuitively see the picture in the data, like Neo.

Source: IT Web

Petr Havlik, Managing Director for CyberPro Consulting

How to effectively manage big data is something all companies need to be aware of. When it comes to a data-driven business like insurance, it can even spell the difference between success and failure. Are local insurers adequately prepared for this?

The concept of big data is certainly not a new one. Many companies have been trying to better deal with the sheer amount of raw data at their disposal, sifting the good from the bad. But to really manage it properly, executives are faced with difficult decisions in identifying the technologies able to process large quantities of data efficiently, within acceptable time frames.

Cynics might argue that big data is just a buzz phrase and ignore it. However, an IBM study has found that 74 percent of insurance companies surveyed report that the use of information and analytics, including big data, is creating a competitive advantage for their organisations.

In South Africa, the rise of the connected lifestyle is resulting in customers who demand more from their insurers. This connectedness has also given rise to more informed consumers who are better aware of competitive offerings than in the past. Not only do they want better pricing, but they also expect innovative value-adds that appeal to their lifestyle requirements. If an insurer is not able to deliver this, the customer is more than willing to change companies. Insurance brand loyalty is a thing of the past.

This means insurance companies now compete on multiple levels, ranging from premiums, customer service and claims experience to brand recognition and product structure, amongst others.

And the foundation to all of this? Quality data.

An insurer needs to implement the kind of IT systems that empower it to make informed decisions based on customer requirements, as well as market trends that will impact it from the short term through to the long term.

According to IBM*, insurance companies must leverage their information assets to gain a comprehensive understanding of markets, customers, products, distribution channels, regulations, competitors, employees, and so much more.

Of course, it is not all about just big data. Using business intelligence tools that integrate data management and analytics becomes essential to building the kind of information needed for making quality business decisions.

Fortunately, South African insurers are willing to adapt. A case in point is the flexibility of solutions available to cater for a range of consumer needs. This could of course not have been developed without getting to grips with big data and testing new solutions. The future is looking promising for those insurance companies that have taken heed of the call to arms and are starting to realise the treasure trove that is their big data.

Source: Cover

At a time when cloud computing is becoming fundamental to business, the importance of integrating systems effectively cannot be overstated. Petr Havlik, director for CyberPro Consulting, looks at the impact this will have in South Africa.

 

“Historically in IT, developing software systems and utilising things like business intelligence solutions were considered separate disciplines. However, this has all changed given technology needs to be much more integrated in order to help the decision-maker gain a single view of the operational areas and customers in their business – and ultimately become a more partner-driven business.”

 

Such an approach reflects a growing shift in a world where companies are looking at expanding their traditional business lines with more value-added offerings. For example, renewing a passport at a bank could never happen without integration between multiple parties. The connected world is now seeing organisations playing multiple roles in the lives of their customers.

 

“Just look at what is happening in the South African landscape. You have telecommunication providers muscling in on the banking space, banks providing all sorts of value-add online offerings, and numerous other companies across a variety of sectors identifying different ways of adding to revenue streams.”

 

Havlik says that this diversification and working with different business partners provide an organisation with a great platform to be successful in the ‘new world’. However, while integration is topical and relevant, it is certainly not very sexy given its focus on back-end processes and systems.

 

“The changes that cloud computing bring to the integration landscape can be exciting and frightening at the same time. In the past, integration meant significant infrastructure and skills investments. Using a cloud platform, such as Microsoft’s Azure, provides the business with access to a centrally hosted environment offering those services.”

 

This gives companies access to a highly available and scalable environment that can be utilised to integrate with their partners. Going the cloud route brings with it significantly lower costs and faster implementation times.

 

“In certain respects, the benefits of integration are providing traditional-minded businesses the first real use cases for adopting cloud-based systems. Even concerns around security and privacy are being addressed thanks to the strong security measures adopted by cloud providers, although customers must remain cognisant of the facts around data storage in offshore locations.”

 

And while there is a need for more multinationals to open data centres in South Africa, to keep certain information within the confines of the country, the reality of seeing this happen is still a few years away.

 

“Irrespective of this, integration is something companies are starting to take more seriously outside of the traditional confines of the IT department. The benefits of doing this effectively cannot be ignored any longer,” Havlik concludes.

Source: The SA Leader