By Yolanda Smit, strategic BI manager at PBT Group. Johannesburg, 4 Nov 2016
As yet another year edges towards its close, the Christmas paraphernalia in commercial districts leaves one in shocked silence at the thought that another year has come and gone. Those who have experimented with agile development practices cannot resist the prompting that the end of a cycle calls for a moment of retrospection.
Reflecting on the past year of business intelligence (BI) industry trends, one ironically finds that ‘agile’ thinking has become far more prevalent in BI circles. Companies are starting to realise that traditional BI development life cycles of three to six months are no longer good enough. High-quality information has become a key ingredient of business success, and therefore the demand for faster delivery is becoming more pervasive.
However, the insight of those who have experimented with agile development practices in BI has started to mature, with the realisation that BI is typically far more complex than application development. More disciplined approaches to agile are therefore required to scale effectively and to enable the shorter delivery cycles that decision-makers need for ad hoc decisions. Not only does BI delivery become faster, but disciplined agile BI delivery enables an agile business, with improved responsiveness to new opportunities or changing circumstances.
These agile practices make the BI delivery team more user-centric, curiously mimicking a common strategic business ambition: to become more customer-centric. It is this strategic objective that is driving the second observed trend of an increased take-up of big data. The ambition of customer-centricity has been evolving over the last couple of years in SA, but the realisation is starting to hit home that a company can only truly be customer-centric if it knows its customers. Multitudes of sources containing fragments of knowledge about customers can be consolidated in innovative ways using big data technology to generate insights never thought possible before. Investment in big data technology exploration and proofs of concept has increased significantly this year, as companies prepare the way for the realisation of their customer-centricity ambition in the next two to three years.
BI is typically far more complex than application development.
Upstream, in the data provisioning value chain, there has also been increased investment in more mature information management capabilities. Companies are investing in robust master data management capabilities that enable more reliable data integration from fragmented sources. The quality of data upstream is also being measured and monitored intentionally, in order to proactively ensure the high quality of the information provided downstream into analytics for decision-making. All of this is being managed more intentionally, as companies, especially in the financial services industries, invest focused effort in maturing their information governance practices. Concepts like data compliance, data owners and data stewards are slowly becoming part of mainstream business management glossaries.
Downstream, analytics has also seen a significant increase in maturity over the last year. BI strategy is shifting its focus away from the masses, who use BI for standard management information reporting or dashboards, and concentrating attention on key power-users who require high-powered, advanced analytics. These users go by various names, such as quantitative analysts, statisticians, actuaries, management accountants, etc. However, these are all just various stage names for the user profile known as the data scientist.
These are the users who approach the BI team with new query or view requests almost daily, where the query result or view is probably only required for once-off use. These are also the users who typically benefit from self-service capability: sourcing their own information, manipulating it according to the business problem they are faced with, generating various analytic views through a wide variety of visualisation features in their analytic tools, or even employing highly sophisticated predictive and prescriptive analytical capabilities.
These users are also the front-runners to lead BI into the reality of true cognitive computing in the near future. Although there hasn’t been real progress in the space of cognitive computing in the last year, curiosity is starting to rise and questions are being asked more often.
All in all, 2016 has been a year of rapid increase in maturity for management of information, as information is becoming ‘big’ – big on the strategic agenda of any company that wants to establish and maintain a competitive advantage in the new digital age, as well as big, as in big data technology.
As far as the delivery of BI capabilities goes, the need for agility rings as true as the sleigh bells of Christmas 2016 that loom ahead. BI is no longer a standalone capability that merely spits out beautiful reports. It is effectively becoming embedded into the competitive DNA of the company, and therefore, agile delivery will become a critical success factor in keeping up with the proverbial Joneses.
Source: ITWeb.co.za
By Julian Thomas, Solution Architect Manager at PBT Group. Johannesburg, 4 Oct 2016
In my previous Industry Insight, I outlined certain important points to consider when defining a cloud business intelligence (BI) strategy. In summary, this came down to having a clear understanding of the business case (and associated benefits), as well as the business use cases (to understand where this would be applied).
At this point, if you had gone through this process, hopefully as part of a coordinated, managed project, you would most typically be right in the middle of the hype cycle of the project. At this stage of the process, everyone is really excited about the expected benefits identified in the strategy, such as improved performance, reduced costs, scalability, agility, improved planning, financial forecasting and costing.
As a result, there is tremendous energy and pressure to proceed rapidly forwards. However, what about the potential downside? And, what about the risks? It is crucial that not just the pros of a cloud BI solution are evaluated, but also the cons. Knowing the potential challenges upfront will allow us to implement mitigating steps, or worst case, delay the implementation until potential issues/risks can be resolved.
While there is certainly no limit to the number of unique challenges that can be encountered across diverse, unique organisations, I believe there are a handful of standard, common pitfalls that organisations should be aware of, and proactively acknowledge and manage.
Data traffic: Initial and/or recurring uploads of data into the cloud can be costly and time-consuming. This can have a significant impact on batch windows and the overall cost of ownership. Very often, additional, dedicated lines need to be set up to the relevant service provider to ensure optimal performance. This is especially true in the African context, where data bandwidth is costly. It is critical that a realistic assessment of these costs is included in the overall estimates and planning.
Security and compliance: Obtaining approval to move data, especially personal information, into the cloud can be difficult. There is a great deal of caution in South Africa right now with regard to personal information. This can often result in a hybrid solution requirement, where certain data has to remain onsite, while non-restricted data is moved into the cloud. At the very least, additional due diligence needs to be performed to ensure regulations are not violated when adopting a cloud BI solution.
Pricing models: Pricing models for cloud BI solutions can at times lack a certain level of transparency. It is often initially easy to consider a cloud BI solution to be cheaper, but on closer reflection, based on pricing/usage, cloud BI solutions can sometimes end up being more expensive over the long term.
Governance: Adopting the cloud as a platform implies relinquishing a certain level of control and governance. On the whole, we are comfortable with this, as most of the vendors and platforms have demonstrated their ability to manage this on our behalf very well. However, it would be foolish to assume all organisations will appreciate this point. This is an important human element to consider and be mindful of.
It is crucial that not just the pros of a cloud BI solution are evaluated, but also the cons.
Having understood the potential cons, it is now important to define the critical success factors against which the potential cloud BI solution needs to be measured. These can once again include numerous points, but there are certain core principles that need to be clearly understood, quantified and measured before continuing any further.
Data privacy and security: Carefully consider the minimum requirements of local and international regulations on privacy and security of data, which can limit what data is stored in the cloud, or hosted in specific countries.
Data transfer rate: Define the acceptable speed at which data needs to be uploaded/downloaded in order to meet batch window and end-user requirements.
Data transfer volume: Define the expected data transfer volume and frequency, and evaluate within the context of existing bandwidth.
Data transfer costs: Define an acceptable cost per gigabyte of data transfer, taking into consideration any potential price escalation clauses based on volume uploaded or downloaded, etc.
Local availability: The importance of reliable Internet connectivity needs to be clearly understood and defined, particularly with regards to the impact that lack of Internet access can have on the solution and the business as a whole.
Cloud availability: The availability of the cloud BI service provider obviously has a huge impact on the success of these solutions. We expect cloud BI service providers to have stable platforms, but what are the organisation’s requirements and expectations regarding this?
Disaster recovery: Appropriate disaster recovery needs to be in place to protect data and solutions, as well as to meet regulatory requirements.
Suitable redundancy: This speaks to the ability of the solution to configure/select the level of redundancy to suit the nature, importance and usage of the data being stored in the cloud.
Change management: This speaks to the internal organisation’s capability to adopt the new paradigm. This is an important part of the successful implementation of the solution.
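The transfer rate, volume and cost factors above lend themselves to a simple back-of-envelope check before vendor selection. The sketch below is illustrative only: the upload volume, line speed and rand-per-gigabyte rate are assumed figures, not taken from any vendor's pricing.

```python
# Illustrative back-of-envelope check for a cloud BI batch window.
# All figures (volume, line speed, price per GB) are assumptions.

def transfer_hours(volume_gb: float, bandwidth_mbps: float) -> float:
    """Hours needed to move volume_gb over a line of bandwidth_mbps."""
    megabits = volume_gb * 8 * 1024          # GB -> megabits
    return megabits / bandwidth_mbps / 3600  # seconds -> hours

def transfer_cost(volume_gb: float, rand_per_gb: float) -> float:
    """Transfer cost at a flat assumed rate per gigabyte."""
    return volume_gb * rand_per_gb

nightly_gb = 50    # assumed nightly upload volume
link_mbps = 100    # assumed dedicated line speed
rate = 0.50        # assumed cost in rand per GB

print(f"Nightly upload: {transfer_hours(nightly_gb, link_mbps):.1f} h")
print(f"Monthly cost:  R{transfer_cost(nightly_gb * 30, rate):.2f}")
```

If the nightly figure approaches the available batch window, or the monthly figure erodes the expected savings, that is exactly the kind of finding worth surfacing early in the planning stage.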
Understanding these points in the beginning of the cloud BI journey will yield great dividends in the future, as it lays the groundwork for all subsequent decisions around vendor and platform selection, and solution implementation options.
Source: ITWeb.co.za
By Juan Thomas, CIO of PBT Group. Johannesburg, 28 Sep 2016
Being a data scientist is one of the hottest jobs in America right now. In fact, according to research, close to half of the 25 ‘best jobs in America’ named are tech-related – and of these, data scientists sit at the top of the list.
However, when looking at the local landscape, is this reality mirrored here, especially when considering the growth of the digital world and its impact on South African businesses?
The answer is a simple no – not because the need for data scientists is not there. In fact, it is the complete opposite. There is a very real need for these unique technical skills in the local business market, especially when considering the amount of data businesses find themselves dealing with, and this data continues to increase significantly year on year.
Businesses are realising that, if used correctly, data actually adds massive value to the bottom line and results in better business profitability. However, the data is often too complex and disparate, and thus requires a unique skill set of a data scientist if the business wants that data analysed to ensure it can actually add value.
A data scientist is someone who has the ability to harness a wide range of skills to translate raw data into a meaningful end result for the business, as well as to communicate this result in a way that tells a story of interest to the audience. To do this, one usually possesses the following skills: technical IT, behavioural economics, statistics, visualisation, psychology and business knowledge.
Yet, SA still finds itself in a rather dire situation when it comes to these needed ICT skills. The results of the Johannesburg Centre for Software Engineering (JCSE) skills survey reiterate this sentiment, confirming the local shortage in ICT is still massive. Couple this with the fact that, often, technically skilled individuals are recruited to work overseas, and the situation is compounded. The result, unfortunately, is a negative impact on the business environment, as companies struggle to find the specialised personnel they need.
Given this, it is becoming clear that more in the way of skills development needs to be done. The JCSE survey is very clear about the fact that there is a need for industry and academia to step in and help SA build the skills needed to drive forward this new digital economy businesses find themselves operating in. Of course, it is great to see how schools and universities are starting to place a focus on programmes dedicated to IT skills development, but this alone is not enough.
From a corporate point of view, more companies need to get involved and become part of the solution. This can be as simple as supporting ongoing programmes already active in the market, which encourage young employees to study further to develop their technical skills capabilities – based on what the market requires. Alternatively, businesses can develop their own programmes or encourage young employees to study further, eg, part-time graduate or diploma courses.
Companies struggle to find the specialised personnel they need.
Those already in the field should speak to the company where they are employed to see if it’s viable to create skills development programmes and technical cross-skilling sessions and workshops, to encourage continuous learning within this space. Data does not stop evolving, so neither should employee knowledge and skills. Continuous on-the-job training with strong mentorship is key to developing the crucial ICT skills needed locally.
Furthermore, the public sector also has a great opportunity here – where it could provide facilities, like training centres and bursary schemes (over and above the current programmes, and ones specially focused on ICT) to assist young professionals in becoming better skilled before – and when – entering the job market in the ICT space.
The need for specific ICT skills in the business world will likely not disappear anytime soon – rather, it will only grow as innovation in this space continues. As a result, a career in this path will serve an individual well.
Corporations in SA should support the development of niche technical skills through IT education and by getting involved in programmes to assist and promote such ICT skills development. Without this commitment, industry cannot ensure the technical skills needed by businesses today will be there in the future – these skills have to be developed if the generations to come are to make an impact.
Source: ITWeb.co.za
By Masindi Mabogo, director at PBT Group. Johannesburg, 31 Aug 2016
The existence of games dates back to ancient times. They were used as a channel for social interaction, knowledge sharing, developing mental skills and entertainment, as well as for teaching spiritual and ethical lessons.
Common game tools were made of bones, sticks, shells, stones, fruit seeds and shapes drawn on the ground. Their features¹ included uncertainty of outcome, agreed rules, competition, elements of fiction, elements of chance, prescribed goals and personal enjoyment. In competitive games, the reward was social status (sole bragging rights) within one's community, or the thrill of reaching higher levels.
Games have always exhibited the psychological ability to: 1) encourage participation through rewarding achievements; 2) influence behaviour through teaching; and 3) improve skills through practical attempts. The progression of technology removed the limitations of ancient tools and provided infinite possibilities for expanding gaming features. Over the years, the gaming world perfected and proved the effectiveness of these attributes, and the notion of gamification today is to draw the strength of these features into company activities.
Badgeville², a company that offers an award-winning enterprise gamification and analytics solution, defines gamification as the concept of applying game mechanics and game design techniques to engage and motivate people to achieve their goals.
This concept taps into users' basic desires and needs, which revolve around the idea of status and achievement. Many other narrations of this concept agree that game elements such as points and rewards are linked to a goal or task as an incentive to encourage participation.
Gartner³ further refined the definition to indicate explicitly that the engagement has to be digital, meaning participants interact with computers, smartphones, wearable monitors or other digital devices, rather than engaging with a person.
There are 10 game mechanics, pulled from the world of video gaming, that are commonly carried over into gamification solutions. These are fast feedback, transparency, goals, badges, levelling, onboarding, competition, collaboration, community and points. Rajat Paharia, founder and chief product officer of Bunchball, discusses these mechanics in detail in chapter 4 of his book, Loyalty 3.0: Big Data and Gamification Revolutionizing Engagement.
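To make a few of these mechanics concrete, here is a minimal sketch of points, badges, levelling and fast feedback in code. It is purely illustrative: the class, badge names and thresholds are invented for this example and do not come from any gamification platform.

```python
# Illustrative sketch of four mechanics named above: points, badges,
# levelling and fast feedback. All names and thresholds are assumptions.

class Player:
    LEVEL_SIZE = 100  # points per level (assumed)

    def __init__(self, name: str):
        self.name = name
        self.points = 0
        self.badges = set()

    def award(self, points: int, activity: str) -> None:
        """Fast feedback: points and badges land immediately after an activity."""
        self.points += points
        if activity == "first_login":
            self.badges.add("Onboarded")   # onboarding mechanic
        if self.points >= 500:
            self.badges.add("Power User")  # achievement badge

    @property
    def level(self) -> int:
        return self.points // self.LEVEL_SIZE + 1

p = Player("thabo")
p.award(40, "first_login")
p.award(480, "weekly_challenge")
print(p.level, sorted(p.badges))  # → 6 ['Onboarded', 'Power User']
```

Competition, collaboration and community would sit on top of a structure like this – a leaderboard, for instance, is little more than players sorted by points.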
Gamification is gaining popularity due to its landscape that makes the hard stuff in life fun. Its addition to Gartner's hype cycle in 2011 also propelled its popularity in the corporate world. In fact, Gartner* correctly predicted that by 2015, a gamified service for consumer goods marketing and customer retention would become as important as Facebook, eBay or Amazon, and that more than 70% of Global 2000 organisations would have at least one gamified application.
Gamification is gaining popularity due to its landscape that makes the hard stuff in life fun.
Many global organisations are already enjoying the competitive advantages derived from their gamification solutions. As more organisations come on board, success will be directly proportional to the value proposition of their incentives. Companies that have realised this are looking at innovative ways to make their incentives relevant and irresistible to their customers. A successful strategy adopted in recent times is to formulate partnerships that extend incentive permutations beyond the shorelines of the business.
For example, South African health and insurance companies have already partnered with clothing stores, grocery stores, hotels, airlines, computer stores, cinemas, car hire companies, florists and many others to expand their reward permutations. Their customers are already enjoying an array of incentives through these mutual alliances, while these companies are greatly influencing customers to strive for a healthy lifestyle and, in turn, entrenching genuine customer loyalty.
My everyday gamification experience is through the health insurance reward programme that tracks my active lifestyle and rewards me for reaching my goals, with yearly cashback (in currency) guarantees, free healthy consumables, shopping discounts and monetary savings for holidays.
I am currently addicted to my mobile running application, which allows me to track and compare workouts, set personal goals, and invite and motivate friends into group activities as well as periodic challenges. I find this motivating, and it guarantees my participation due to its appeal to my natural desires for competition, achievement and improvement. I am sure everyone can identify with a few examples from their personal experiences.
Generally speaking, the future success of gamification will largely depend on the assertiveness of the incentive to engage the participant in order to influence their behaviour, while meeting business objectives.
Bunchball*, the first company to offer a technology platform (Nitro) that integrates game mechanics into non-game digital experiences, advocates that customers are hungry for reward, status, achievement, competition and self-expression, and that they'll go out of their way to engage with the businesses that best provide these.
By Jessie Rudd, BI consultant at PBT Group. Johannesburg, 2 Aug 2016
In the space of a few weeks, an augmented reality game called Pokémon Go has managed to do the impossible. It has surpassed WhatsApp, Instagram and Snapchat, and is on a par with Twitter for daily active ‘on the app’ users.
This is a game featuring Pokémon – originally a Game Boy video game that entranced children of all ages in the 90s. It has somehow, with the help of serious amounts of big data, rocketed itself into a whole new cross-section of the population.
Those who are not gamers, or addicted to their smartphones, are more than likely a little perplexed by the whole phenomenon. So then – what is Pokémon Go? To put it relatively simply, it is a location-based, augmented reality game for mobile devices, developed by a company called Niantic for iOS and Android operating systems.
Augmented reality is a technology that overlays a generated image on a user’s view of the ‘real’ world. This overlaying of images creates a composite view, viewable only via the application doing the overlay. So, basically, there is a game that is overlaying itself on Google Maps, directing players to various sites to ‘capture’ various Pokémon. The brilliance behind this second-generation phenomenon has been many years in the making.
It all comes down to big data.
In 2011, the company behind Pokémon Go, Niantic, released a game called Ingress. Ingress was one of the first of its kind. A fitness game, otherwise known as an exergame, Ingress relied heavily on the telematics that is present in all smartphones on the market nowadays. Telematics can be used to track body movement and reactions, which is exactly what Ingress did for Niantic.
This completely new and revolutionary method of collecting data used the game to direct users and players to various ‘portals’ or sites, which were initially extrapolated from geotagged photos on Google Earth. Players were also actively encouraged to submit more sites for consideration, and to date, around five million sites have been approved for use. These sites were suggested, collated, collected and verified using seriously advanced analytics and big data.
These sites are approved for use by games like Pokémon Go, which, by the very nature of what they are, are going to give rise to a whole new mountain of big data; and not just any big data – relevant, right-now big data.
One of the biggest drawbacks of big data is that it is, more often than not, historical data, without context and relevant meaning. Not in this case. For the first time, on a scale that can truly be called big, big data is relevant and extremely powerful. This may explain why many attempts to hack the game have already been documented, and why people are being warned to be careful – and it may also be why this step in a new direction is a little bit scary.
For the first time, on a scale that can truly be called big, big data is relevant and extremely powerful.
Let’s be real for a moment, using a simple example. Google, with very little effort, is already fully capable of determining where someone is, how s/he got there, how long it took to get there, how long the person will be there for, etc. All this is calculated because the person set up an appointment on Google Calendar, synced a reminder with his/her phone, and looked up the destination on Google Maps. Without too much effort, an entire company knows exactly where this person is. The thing is, however, there is an inherent trust in ‘corporations’. Users assume, or hope, the companies have an ethos in place that will protect them from abuse or exploitation.
So imagine, then, how much power someone with less-than-desirable intentions would have, should they be able to gain access to the Pokémon Go server. How much traffic would they be able to direct, or divert, exactly where they want it?
Doom and gloom aside, though, let's think about the practical applications of exergames like this. Let's say I paid a company like Niantic to ‘place’ one of the ‘collectables’ near my coffee shop. My sales would skyrocket. The marketing possibilities are quite mind-boggling.
Let's take it a step further, though. This same method of data collection could just as easily be tweaked by marketing companies to collect and collate real-time data. This data can then be stored and analysed to become intelligent data, giving invaluable insight into where a person shops and how long they stay in a shop; billboards can then, of course, be placed along the route they travel to get there.
Having this data intelligence means a business can develop and offer customised offerings based on the initial real-time location that was achieved from the data collection. In fact, when looking at the bigger ‘data’ pictures and thoroughly following the data processes, the possibilities are limited only by the imagination.
Data collection like this, along with data analysis, is fast becoming mainstream. Unless marketing companies get good data collection methods in place, with well-equipped and forward-thinking analysts who can also analyse the data effectively, they are soon going to find themselves lost in a world of 1s and 0s.
But right now, in a world gone Pokémon Go mad, with no real clue how to navigate it, the answers might just lie in data collection, analysis and intelligence.
Source: ITWeb
YOLANDA SMIT - 22 July 2016
The progressive application of data governance to priority areas of business data provides immediate benefits while companies work towards the end-goal of automated data governance systems, says business intelligence firm PBT Group strategic BI manager Yolanda Smit.
Assigning accountability for data to various business functions, and formalizing the existing informal data management systems in line with business rules and requirements, will immediately provide better oversight of business-critical functions.
With each rule and principle of data governance defined, adding new data and systems becomes easier and faster, and risks are reduced or eliminated.
Integrating and automating business processes and systems requires that rules and policies be effectively applied to the data related to them. Data governance processes thus underpin the holistic transformation of the business, she adds.
Data governance also supports the data architecture of a business by ensuring that information is effectively referenced to provide accurate and comparable views. This reduces data storage, management and associated data-law risks while improving basic business functions.
“Standardization of data and data quality improves the efficiency of all business systems using these critical data. However, the best way to deal with these issues progressively and on a granular level is to determine what data is strategic or high priority and then manage those first.”
“We advocate a pragmatic and systematic approach to improving data governance. While our customers typically worry about the complexity, once we start to answer some basic questions for data governance – who owns the data and which manager is responsible for it – it is easy to identify the highest-priority work.”
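As a hypothetical illustration of the triage Smit describes – identify owners, then tackle the highest-priority data first – consider the sketch below. The domains, owners and scores are invented for the example; a real exercise would source them from a data governance workshop.

```python
# Hypothetical triage of data domains for governance attention.
# Domains, owners, and the impact/risk scores (1-5) are invented.

domains = [
    {"name": "customer master", "owner": "Sales",   "impact": 5, "risk": 4},
    {"name": "product codes",   "owner": "Ops",     "impact": 3, "risk": 2},
    {"name": "ledger entries",  "owner": "Finance", "impact": 5, "risk": 3},
    {"name": "web clickstream", "owner": None,      "impact": 2, "risk": 5},
]

def priority(domain: dict) -> int:
    """Score a domain: business impact x quality risk, plus a penalty
    for data that has no accountable owner yet."""
    ownership_gap = 5 if domain["owner"] is None else 0
    return domain["impact"] * domain["risk"] + ownership_gap

for d in sorted(domains, key=priority, reverse=True):
    owner = d["owner"] or "UNASSIGNED"
    print(f'{priority(d):>2}  {d["name"]:<16} owner={owner}')
```

Even a toy ranking like this surfaces the two questions Smit poses – who owns the data, and which of it matters most – before any tooling decision is made.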
The business rules inherent in any organisation can readily be unearthed and formalized, and companies are typically surprised at the ease with which data governance progress can be implemented and the value that is unlocked by improving data management, says Smit.
Finally, having good data management and governance systems in place is very effective in ensuring control over the business and that it easily meets regulations within multiple jurisdictions, as is often required of multinational firms.
High-quality data also bolsters any further digitisation, regulatory changes and information technology system improvements, thereby enhancing the sustainability of the business.
The value of best-practice data governance goes beyond effective compliance: detailed knowledge of data flows and processes affords an opportunity to streamline them, making data governance a catalyst for efficiency, concludes Smit.
Source: Engineering News
By Yolanda Smit, strategic BI manager at PBT Group. Johannesburg, 14 Jul 2016
There’s an interesting buzz in the business intelligence (BI) industry, where more and more clients are starting to ask what BI of the future looks like. I think there are a few factors that drive this question.
Vendors’ messaging to their target market is definitely a strong driver that’s causing users of their technology to start wondering. The typical messages from vendors are that BI is dead and analytics replaces it. Depending on the vendor’s own internal strategy, it might either be pushing for cloud platforms for analytics like the big four (SAP, Oracle, IBM and Microsoft), or some of the more niche players are punting visualisation or advanced analytics as if it’s the be-all and end-all of the ‘new BI’. And if the market is not confused yet, they’ll throw words like search and text analytics, self-service, in-memory, or agile into the mix.
In the process of answering this question with my clients in the last year, I realised that companies have to get past the hype, and get back to understanding that all the available forms of BI exist for one reason only, and that is to meet actual business needs. Therefore, instead of asking what the future of BI will look like, rather ask what the future of the company’s BI should look like to meet the needs of its information users and support its organisational strategy.
Granted, no company is 100% unique, and there are a few common business change trends that I’ve picked up in my experience with a variety of clients, where new and alternative BI technology approaches should be considered in order to adapt to the changing needs of business.
The first common trend speaks to how the general profile of data sources has changed over time. Traditionally, users’ needs were satisfied by providing them with operational data brought into a central data warehouse, combined with a stock-standard stack of BI tools (reports, scorecards, and dashboards).
However, more users now struggle with additional, non-operational data from various sources, ranging from internal spreadsheets to external cloud data providers like Google Analytics. As a result, users create manual workarounds to cope with the diverse data, leaving them feeling overwhelmed and constrained. The BI of the future must therefore address far more diversified data needs, with a convergence of varied technologies specialised to different scenarios.
Just as the data source diversifies, the profile of BI users also changes significantly, leading to the second trend I’ve picked up on. The traditional user base was predominantly management decision-makers on all levels, supported by power users. The changing trend sees this profile being extended by adding a large operational consumer layer. Information is more pervasively consumed directly by operational systems to drive rule engines, enabling operational decisions as part of the workflow.
On the other extreme, increased sophistication in power users gives rise to a highly specialised community of data scientists needing advanced technologies such as big data, predictive and prescriptive analytics, and even machine learning for building operational intelligence into the operational systems. These data scientists are also the typical users that require far more self-service power in their BI tools.
This democratisation of data requires a paradigm shift that makes data central to IT and BI capability development in order to ensure the business intelligence competency centre’s (BICC’s) ability to effectively service the needs of the end-users. The traditional engagement model between business and IT has left BI orphaned and treated as an after-thought, but as BI becomes operationalised and more strategically relevant, data considerations must become central to the systems development life cycle.
All the available forms of BI exist for one reason only, and that is to meet actual business needs.
Finally, BI needs to deliver at the speed of decisions. Traditionally, daily, weekly or monthly data refreshes were sufficient, and business accepted a six- to 18-month timeframe for delivering new capabilities. Today’s pace of business has increased significantly. Companies must be agile and adapt their tactics and strategies in-flight in order to remain competitive. The operational dependencies on BI therefore imply a dire need for faster, near real-time refreshes and more agile, flexible delivery cycles.
The last, and potentially most disruptive, factor is increased regulation. This trend is especially common in the financial services industry, but increasingly impacts other industries too (especially companies that have crossed into providing financial services as value-added services). Corporate regulations like King III have matured over the last decade, to the point where detailed regulations explicitly touch on the management and use of data and information to ensure reliable decision-making at a corporate governance level.
Besides corporate regulation, Acts like POPI and the Amendment Bill to the ECT Act have significant implications for what companies may or may not do with their data, forcing BICCs to revisit their own methodologies, practices and governance.
The implications of these common business and environmental trends point to the answer: the main issues the BI ecosystem of the future must cater for:
* Rapid delivery of information, supported by conventional BI capabilities integrated with next-generation architectures, including data discovery, data cataloguing, data integration, data virtualisation, advanced analytics and more;
* More agile BI delivery that enables tangible business value through data science, at the speed at which companies make decisions; and
* Careful governance of all components of the ecosystem, in order to protect the quintessence of BI: the single version of the truth.
Inevitably, a highly complex ecosystem such as this requires conscious stewardship, starting with a well-rounded, robust and sustainable strategy, with the strategy becoming the driving factor of what a company’s BI should look like in the future.
A clear strategy empowers BI decision-makers to wade through the hype and identify the various innovative BI technologies that best suit their companies’ needs, and thereby become the creators of their own BI fate.
Source: IT Web