2018 predictions: retail

This January, we continue our tradition of publishing predictions from our experts for the year ahead, starting with retail.

Here’s what Bhavesh Unadkat, Principal Consultant of Retail Customer Engagement, told me:

“With customers demanding a deeper emotional connection and more personal experience with brands, we can expect to see retailers re-evaluating their business models to take a more ‘human-centred’ approach during 2018. Over the past year, we’ve seen a number of brands enjoying success from service-led approaches, including Deliveroo with its delivery service and Cornerstone with its subscription model. The market could change significantly in the next 12 months as retailers try to deliver a better, service-driven approach to retailing. These opportunities will be further amplified by the impact of connected devices becoming more mainstream.

“Innovation has been another big focus area for retailers in 2017. Many have either set up an innovation capability or partnered with someone to do so; this is key to leveraging emerging technology and having a space to ideate, test and learn in order to create experiences that provide a differentiator. Many retailers are comfortable working with new and niche technologies and solutions to break through some of the challenges being faced in retail. The key challenge I see here is how retailers can take this from a pilot or trial into something that can be scaled across their whole portfolio.

“As these technologies evolve and costs decrease, we will see the retail sector experimenting with new ways of engaging the customer and offering a personalised experience. This could, in turn, see the revival of the physical store. In 2017, Amazon’s acquisition of Whole Foods sent shockwaves through the industry – suddenly, the notion that brick-and-mortar stores are dying was challenged. With the various use cases for technologies like AI and AR to improve the customer experience, we can expect other retailers to re-evaluate brick-and-mortar stores as vehicles for driving loyalty in 2018. The store of the future will be based around personalised customer engagement, immersive experiences and selling not just products, but entire lifestyles.”

For more predictions from our wider retail team, read this blog post.

MyLiFi is a smart lamp that beams broadband to your laptop

Oledcomm is debuting one of the more unusual CES 2018 products today with the launch of MyLiFi. The French company uses light from a smart lamp to beam Internet broadband to your laptop.

It promises a speed of 23 megabits a second, faster than the average Wi-Fi radio connection. MyLiFi is a light-emitting diode (LED) lamp that provides wireless Internet access through light in a secure way, the company said. The company will demonstrate the product at CES 2018, the big tech trade show in Las Vegas next week. It plans to launch a Kickstarter campaign in January and will set preorder prices at $500 or €499.

LiFi is short for Light Fidelity (a reference to wireless fidelity in Wi-Fi), and it allows mobile devices to connect via LED lights. It transmits data by modulating the light signals from an LED light bulb. The process is invisible to the human eye, and thousands of people are already using it in France, where the average Wi-Fi speed is about 10 megabits a second.

Light signals are received and converted into data by a dongle connected to the device, and the dongle has to be in the line of sight of the lamp.
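The modulation idea can be illustrated with a toy sketch. This is a simple on-off keying (OOK) example of the general principle described above, not Oledcomm's actual modulation scheme, and the function names are invented for illustration:

```python
# Toy illustration of LiFi-style on-off keying (OOK): encode bits as
# light-intensity levels, then decode by thresholding. A real system
# modulates far faster than the eye can perceive, so the lamp appears
# to glow steadily.

def encode_ook(bits, high=1.0, low=0.0, samples_per_bit=4):
    """Map each bit to a run of intensity samples (the 'light signal')."""
    signal = []
    for bit in bits:
        level = high if bit else low
        signal.extend([level] * samples_per_bit)
    return signal

def decode_ook(signal, threshold=0.5, samples_per_bit=4):
    """Recover bits by averaging each symbol period and thresholding."""
    bits = []
    for i in range(0, len(signal), samples_per_bit):
        chunk = signal[i:i + samples_per_bit]
        bits.append(1 if sum(chunk) / len(chunk) > threshold else 0)
    return bits

data = [1, 0, 1, 1, 0, 0, 1, 0]
assert decode_ook(encode_ook(data)) == data
```

The threshold-based decoder also hints at why line of sight matters: the dongle must receive enough light to distinguish the high and low levels reliably.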

Suat Topsu, a French researcher, created the technology as part of a research team in 2005. He started the Vélizy, France-based company in 2012, and it now has 30 employees. Benjamin Azoulay, who previously worked on the launch of the Philips Hue, is now CEO and Topsu is president.

The technology was previously used for business-to-business purposes, such as helping blind people navigate public transport systems, transmitting medical information in hospitals, and measuring travel times in supermarket aisles. The consumer version (designed by EliumStudio) is designed so you can point it in any direction, and the color temperature can be adjusted from cold white to warm orange to enable different kinds of lighting effects. The cable for the lamp connects to an Ethernet port in a router, and the dongle connects via a USB port to a computer.

“Just as clean energies are displacing fossil fuels and propelling us towards a world of responsible innovation, light is now replacing radio waves to provide a safe, people- and eco-friendly internet connection,” said Azoulay, in a statement. “MyLiFi marks the start of a new era in connectivity.”

The 800-lumen MyLiFi uses only 13.5 watts of energy, as compared to 20 watts for a Wi-Fi router. You can control it using a web or smartphone app, changing the light intensity or setting lighting atmospheres and log-off times. Oledcomm said gamers can use the connection for a fast and reliable “ping,” or faster interaction in online games. Oledcomm is also pitching it to the business market, where companies can use it in public settings.

Moreover, connections are much more secure: LiFi signals use visible light and cannot pass through walls, making it much harder to hack a company’s Internet connection without attacking its firewall. And Oledcomm notes that MyLiFi offers an alternative to radio and electromagnetic waves, which may be harmful to humans. Upload speed is 10 megabits a second, while download speed is 23 megabits per second. The lamp weighs about 14 pounds.

This article was written by Dean Takahashi from VentureBeat and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Unpredictions – what won’t happen with artificial intelligence

Artificial intelligence promises many things; however, some of these claims are exaggerated and unlikely to be realized in the short term. Conversica Chief Scientist Dr. Sid J. Reddy gives readers the lowdown.

Artificial intelligence and machine learning are two of the key tools for the digital transformation of many businesses. From Amazon Alexa to autonomous vehicles, artificial intelligence is progressing at a very fast rate. However, there remain many technological limitations in terms of what machine intelligence technology can deliver in the short-term.

The company Conversica is a leader in conversational artificial intelligence for business, and Conversica Chief Scientist Dr. Sid J. Reddy has shared with Digital Journal readers four things that are unlikely to happen with artificial intelligence during 2018. Dr. Reddy refers to these as “unpredictions”, turning the analysts’ usual practice of making predictions on its head.

Self-driving cars

The first area is that “we won’t be riding in self-driving cars”. As Dr. Reddy explains: “While many are predicting a driverless future, we’re a long “road” away from autonomous vehicles.” This is in terms of cars that will take commuters to work, a situation where the commuter can sit back and read his or her iPad while paying little attention to the traffic outside. He adds: “For a number of years ahead, human operators and oversight will still rule the roads, because the discrete human judgments that are essential while driving will still require a person with all of his or her faculties–and the attendant liability for when mistakes happen. Besides technical challenges, humans tend to be more forgiving about mistakes made by human intelligence as opposed to those made by artificial intelligence.”

Disappearing jobs

The second ‘unprediction’ is that people will not be replaced by AI bots this year. Dr. Reddy states: “While it is possible that artificial intelligence agents might replace (but more likely supplement) certain administrative tasks, the reality is that worker displacement by AI is over-hyped and unlikely.” So robots won’t be taking over most jobs any time soon. This is because, as the analyst states: “Even in an environment where Automated Machine Learning is helping machines to build machines through deep learning, the really complex aspects of jobs will not be replaced. Thus, while AI will help automate various tasks that mostly we don’t want to do anyway, we’ll still need the human knowledge workers for thinking, judgment and creativity. But, routine tasks beware: AI is coming for you!”

Medical diagnosis

The third aspect is that we won’t get AI-powered medical diagnoses. This is, Dr. Reddy says, “Due to a lack of training data and continued challenges around learning diagnosis and prognosis decision-making through identifying patterns, AI algorithms are not very good at medical decision automation and will only be used on a limited basis to support but not replace diagnosis and treatment recommendations by humans.” He adds: “AI will be increasingly deployed against sporadic research needs in the medical arena, but, as with fraud detection, pattern recognition by machines only goes so far, and human insight, ingenuity and judgment come into play. People are still better than machines at learning patterns and developing intuition about new approaches.”

AI at work

The fourth and final area is that we will still struggle with determining where artificial intelligence should be deployed. Dr. Reddy states: “Despite what you might be hearing from AI solution vendors, businesses that want to adopt AI must first conduct a careful needs assessment. As part of this process, companies also must gain a realistic view of what benefits are being sought and how AI can be strategically deployed for maximum benefit.” The analyst adds: “IT management, business users and developers should avoid being overly ambitious and carefully assess the infrastructure and data required to drive value from AI. Best practices and “buy versus build” analysis also should be part of the conversations about implementing AI applications.” While artificial intelligence promises many things, the message from Dr. Reddy is that we need to be realistic and more circumspect with our short-term expectations as to what the technology can deliver.


The views expressed in content distributed by Newstex and its re-distributors (collectively, “Newstex Authoritative Content”) are solely those of the respective author(s) and not necessarily the views of Newstex et al. It is provided as general information only on an “AS IS” basis, without warranties and conferring no rights, which should not be relied upon as professional advice. Newstex et al. make no claims, promises or guarantees regarding its accuracy or completeness, nor as to the quality of the opinions and commentary contained therein.

This article was from Digital Journal and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Project to double range of electric vehicles with new batteries

A new British project sets out to double the range of electric vehicles through the use of silicon-based batteries. The £10 million scheme is led by the startup Nexeon.

The goal of U.K. startup Nexeon is to develop new silicon materials for Li-ion batteries. The aim is for these to provide electric vehicles with a range of over 400 miles, which is around double what is currently possible with more conventional forms of energy storage devices.

The parent project is called SUNRISE, an acronym derived from the names of the companies involved (Synthomer, UCL & Nexeon’s Rapid Improvement in the Storage of Energy). The focus is to use silicon as a replacement for carbon in a battery cell anode. The project is also concerned with optimizing battery cell designs for a range of automotive applications.

The part played by Nexeon is to commercialize a silicon anode. Synthomer leads on the development of a next-generation polymer binder optimized to work with silicon, ensuring anode/binder cohesion over a lifetime of charges. The third player is from academia – University College London (UCL), where researchers will lead the work on material characteristics and overall battery cell performance. The reason silicon is being considered as a replacement for carbon in battery anodes is to increase the energy storage potential.
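A rough back-of-the-envelope calculation shows why silicon is attractive. The capacity figures below are standard textbook values for the two anode materials, not numbers published by the SUNRISE project:

```python
# Back-of-the-envelope comparison of anode materials, using widely
# cited theoretical specific capacities (not SUNRISE project data).
GRAPHITE_MAH_PER_G = 372   # fully lithiated graphite (LiC6)
SILICON_MAH_PER_G = 3579   # fully lithiated silicon (Li15Si4)

ratio = SILICON_MAH_PER_G / GRAPHITE_MAH_PER_G
print(f"Silicon stores roughly {ratio:.1f}x more charge per gram than graphite")
# The catch, as discussed below: silicon swells dramatically on
# lithiation, which is the expansion problem the project must manage.
```

Cell-level gains are far smaller than this ~10x material-level ratio, since the anode is only one component of the cell, which is why the project targets a doubling of range rather than an order-of-magnitude jump.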

One thing hampering such development is expansion, which occurs when cells are charged and discharged. Project SUNRISE aims to address the silicon expansion issue, enabling more silicon to be used and thereby raising the energy density that can be achieved in the cell. Commenting on the process, Dr Scott Brown, CEO of Nexeon, told EE News Europe: “The biggest problems facing EVs – range anxiety, cost, charge time or charging station availability – are almost all related to limitations of the batteries.”

The researcher added: “Silicon anodes are now well established on the technology road maps of major automotive OEMs and cell makers, and Nexeon has received support from UK and global OEMs, several of whom will be involved in this project as it develops.” Funding for the project has been provided by Innovate UK, as part of the Faraday Battery Challenge.

The core ambition of the program is to make the U.K. “the go-to place for the research, development, manufacture and production of novel battery technologies for both the automotive and the wider relevant sectors.” There is also a project for schools, aimed at teaching students about batteries and energy, as shown in the video below:



This article was from Digital Journal and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

2018 predictions: financial services

Following predictions for retailers, today we look at the financial services sector.

Open Banking and APIs

Open banking is gaining significant momentum across the globe: businesses, including banks, are getting better at adopting an “open” model and embedding it into the industry. Thanks to increasing financial data transparency, third-party and partner APIs are allowing financial service companies and banks to take innovative ideas to market quickly and cheaply.

It is still in its infancy and in need of strict regulations and security measures, but the outlook is very positive. Businesses are monetising data and building new revenue streams. Those that don’t think strategically and establish themselves within open banking run the risk of being disintermediated from their customers.

Maturing blockchain technology

We’ve seen great progress this year on blockchain technology. While perhaps not as quickly as we had anticipated at the beginning of 2017, we’re now seeing some very successful pilots being deployed across the financial services industry. I expect lots more will come in the next 6-9 months as the maturity of blockchain as both a solution and opportunity grows.

VR becoming more of a reality

It is clear that technology is affecting financial services in a multitude of ways. In the next year, we’ll see AI and VR becoming real opportunities to engage customers with a fully immersive and interactive experience, such as providing customers with virtual banks. We’re already seeing fewer bank branches opening, and this decline will continue as VR and AI come into play.

Branch-free banking

Mobile banking will continue to lead in the next year. Only 18 months ago, few if any banking transactions were made on mobile; by 2020, we predict this figure will be around 90%.

More noise from the fintechs

Traditionally, market entrants found it difficult to crack the financial services industry. Now, conversations around fintechs have vastly matured over the last year. Typically honing in on one specific service, they are able to refine and perfect it. Consumers are now realizing what is on offer, and customer service expectations are increasing. New entrants like Monzo have shown consumers what good banking services could look like – the business is legacy-free and innovatively handles user experience through technologies such as chatbots and biometric authentication. Be prepared to see the financial services industry become far more technologically competitive over the next twelve months. Competition will be particularly fierce, with fintechs creating new marketplaces through their unique service offerings.

6 Must-Have Ingredients for Effective Virtual Teams: A Recipe for Success

Virtual teams are becoming a new fact of life for businesses all over the globe. However, while virtual teams are being used ever more frequently, that doesn’t mean they’re being utilized or managed properly. In fact, OnPoint has found that many organizations try to apply the same basic guidelines and best practices they use for their co-located teams—which, unfortunately, doesn’t work very well.

To help businesses across the globe get more from their investments in virtual collaboration, OnPoint conducted a study of dozens of virtual teams to identify and understand the success factors that separate top-performing virtual teams from less successful ones. The study found that a full 27% of the teams surveyed were not “fully performing” up to their expected standards.

Why were these virtual teams failing? In the course of answering this question, OnPoint Consulting discovered six different “must-have” ingredients for facilitating effective virtual teamwork that form the recipe for virtual team success:

1: A Focus on People Issues

One of the basic issues with a virtual team is the lack of human contact. This lack of contact can result in a disassociation between team members that exacerbates any conflicts that arise. These interpersonal issues can contribute to an “us vs. them” mentality between team members in different locations or subgroups. The end result is a fractured team that fails to collaborate with one another—dragging down efficiency and productivity.

Some corrective actions you can take include:

  • Developing a team page or adopting a collaboration software that allows virtual team members to share information and get to know one another.
  • Creating methods for team members to interact and communicate effectively.
  • Building a collective resource bank to share experiences.
  • Finding ways to spotlight team members and recognize their contributions or celebrate successes.
  • Sending newsletters and updates to the team.
  • Partnering team members from different locations on projects and rotating partnerships so everyone ends up working closely with each team member at some point.

2: Trust Building

Trust is a major requirement for any team to function effectively. On a virtual team, it becomes even more crucial for success. In virtual teams, however, trust seems to develop more readily at the task level than it does on the interpersonal level—most likely because of the lack of personal contact.

Some warning signs that trust is lacking on a team include team members:

  1. Not referring to themselves as part of a group—not saying “we” when talking about themselves or the team.
  2. Not appearing to know one another very well—showing no interest in the welfare and personal situations of others on the team.
  3. Being openly negative—both about the contributions of other team members and the general direction of the team’s work.
  4. Not regarding the commitments and assertions of others as credible—they may mock or discredit others who make promises.

If conditions of low trust exist, some effective countermeasures include:

  • Holding face-to-face team meetings at least once early on in the team’s formation.
  • Giving team members a place to hold truly “open” communication.
  • Making team members feel empowered to make and act on decisions.
  • Resolving conflicts that arise rather than avoiding them.
  • Making sure that the team leader models these positive behaviors and reinforces them in the team.

3: Training in “Soft” Skills

So-called “soft” skills—those skills that facilitate effective interpersonal interactions—can make an enormous impact on a virtual team’s performance. Throughout its research, OnPoint has found that virtual teams who have been through team building and interpersonal skills training consistently outperform teams that have not undergone such training.

However, all too many organizations do not invest in such training, despite the strong link between interpersonal skills and team performance. Some things that can be done to help make sure virtual teams have strong interpersonal skills include:

  • Using assessments that emphasize soft skills when selecting or vetting virtual team members.
  • Holding team-building sessions—preferably conducted at one of the face-to-face team meetings—to help team members get to know each other personally and strengthen working relationships.
  • Assessing development needs for team members and team leaders. Then, conducting skill building exercises focused on the areas for improvement that are identified.
  • Periodically reassessing needs over time.

4: Identifying and Resolving “Performance Peaks”

Virtual teams that have worked together for a while tend to be more successful than teams that have only recently formed. However, many of these long-standing teams face a performance peak right around the one-year mark. High-performing teams can avoid a major decline in productivity following the performance peak by implementing a few specific strategies—lower-performing teams struggle to overcome these peaks because they lack an appropriate strategy.

So, if you see the warning signs of a performance peak—such as team members getting along, but not producing results, an apparent lack of team direction, and team members not committing adequate time to the team—it may be time to take one or more of the following corrective actions:

  • Clearly define team roles and accountabilities to minimize potential frustrations and misunderstandings that derail productivity and damage morale.
  • Regularly review team processes.
  • Periodically examine the level of the team’s performance—also collect feedback from various stakeholders to assess team performance.
  • Based on the outcomes of the assessments, identify barriers to high performance and the steps that can be taken to overcome/remove these barriers.

5: Creating a “High Touch” Environment

Advances in telecommunications technology have made it easier than ever to work on a virtual team, but it still isn’t a perfect replacement for direct human interaction. The inability to replicate a high touch environment—one where people can tap each other on the shoulder and get that interpersonal interaction—remains one of the greatest performance barriers on virtual teams.

Face-to-face meetings may require time and expense for virtual teams, but making the investment to hold such meetings once or twice a year can help the team perform better overall by providing a high touch environment.

Some warning signs of a lack of a high touch environment include poor communication, a lack of engagement, and a lack of attention during virtual meetings. Some ways to remedy the loss of a high touch environment include:

  • Leveraging synchronous tools (e.g., Instant Messaging) to increase spontaneous communication.
  • Using tools such as electronic bulletin/message boards to create a sense of shared space.
  • Carefully choosing communication technologies that are most appropriate to the specific task at hand.
  • Developing a communication strategy—but re-examining these processes over time.
  • Making wider use of videoconferencing to simulate face-to-face communication.

6: Effective Virtual Team Leadership

Leadership remains the most important factor for the success of virtual teams. OnPoint’s study, as well as other research, shows that effective leadership has a statistically significant correlation with higher performance on virtual teams. To be effective, team leaders in a virtual environment must be especially sensitive to interpersonal, communication, and cultural factors so they are able to overcome the limitations imposed by distance.

Some warning signs of an ineffective virtual team leader include:

  1. The team consistently failing to meet performance objectives—and deliverables are often late or of poor quality.
  2. Relationships between team members and the virtual team leader are characterized by stress and delayed communication, and are best described as “dysfunctional.”
  3. The leader is unclear about the team’s direction, purpose, and objectives.
  4. The team leader pays more attention to co-located team members than to remote team members—or team members who are their “favorites.”

The best way for organizations to avoid this particular performance problem is to select team leaders not only based on technical skills, but on their soft skills as well. In a virtual team environment, soft skills can have a major impact on team performance.

If you’re a team leader on a virtual team, it’s important to make a self-assessment to determine whether you might be the cause of poor performance on the team. This can be a hard thing to confront, but it is necessary. However, should you find that you, as a team leader, are facing this particular performance barrier, there are a few things you can do:

  • Set clear goals and direction for the team, revisiting them as priorities shift.
  • Engage team members in development of team strategy.
  • Provide time for team building and coordinate periodic face-to-face meetings.
  • Find ways to ensure that team members feel included.
  • Provide timely feedback to team members; be responsive and accessible.
  • Emphasize common interests and values and reinforce cooperation and trust.
  • Create a system to easily integrate new team members.
  • Teach the importance of conflict resolution.
  • Celebrate team achievements and successes.

Organizations that “get it right” know that there are stark differences between their virtual teams and their co-located teams. Unfortunately, all too many organizations have yet to effectively act on this critical insight.

There are many well-intentioned companies that have failed or have been harmed because of the tendency to treat virtual teams and co-located teams in the same way. Worse yet is when virtual teams are started on a whim without the necessary planning or follow-up—this is never a recipe for success.

However, with better planning and follow-up, organizations can dramatically improve the success of their virtual teams—achieving a better ROI for their investment.



This article originally appeared in 21st Century Leadership Insights.

 

This article was written by Darleen DeRosa from Business2Community and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Intel teams up with Paramount Pictures to explore VR films

Intel is partnering with Paramount Pictures to explore the cinematic applications of its volumetric video technology, which it’s previously used to livestream and record virtual reality sports matches. The announcement comes after the company finished construction on Intel Studios, a 25,000 square foot volumetric video capture stage in Los Angeles.

At its keynote address ahead of the Consumer Electronics Show, Intel CEO Brian Krzanich showed how viewers can experience a VR film from various perspectives. A short snippet from a Western film served as an example, zooming from camera to camera as a couple of cowboys brawl outside an old saloon — and then showing what the scene looks like from the perspective of the horse.

After the demo, Krzanich invited Paramount CEO Jim Gianopulos on stage to comment on what movies might look like with volumetric video capture.

“When you give filmmakers tools like this, their imaginations are unfettered,” said Gianopulos. “You get some of the most gifted and talented filmmakers in the world, we can only imagine what they can do with this technology.”

Intel will also be filming the Winter Olympics in Pyeongchang, South Korea next month with its volumetric video technology, with some events to be livestreamed and others available on demand.

This article was written by Stephanie Chan from VentureBeat and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Intel says its new palm-sized PC is the smallest VR-ready computer ever

To showcase its latest 8th gen Core i7 processor at CES 2018, Intel has unveiled two flavors of a new NUC that fits in the palm of your hand and is said to be capable of running VR experiences and games.

That means powerful graphics performance in a tiny package that’s easy to carry around and use at your desk. Intel began showing off Next Unit of Computing (NUC) devices a couple of years ago, but this one, equipped with a mighty new chip, finally makes the concept truly worth taking notice of.

Credit: Intel. Intel’s new NUC packs plenty of power into a palm-sized package.

The two models, the NUC8i7HVK and NUC8i7HNK, feature Intel’s new Core i7 chips coupled with AMD RX Vega M GPUs, negating the need for bulky graphics cards. The HVK model comes with AMD’s Vega M GH hardware, which has more compute units, a higher clock speed, and support for overclocking. The HNK is bundled with the lower-end Vega M GL chip, which isn’t quite as fast but works with Core i5 as well as i7 processors, and should be a bit cheaper.

Intel says that the Vega M GH should deliver significantly better gaming performance than Nvidia’s GTX 1060 cards, while the GL can trump a 1050. Engadget notes that this means at least 60fps graphics at 1080p resolution with games like DOOM; that doesn’t automatically translate into VR-readiness, but it’ll be interesting to see how these machines perform in benchmark tests when they’re closer to hitting store shelves in the spring.

You also get a bunch of ports: Thunderbolt 3 USB-C, gigabit Ethernet, Mini DisplayPort, USB 3.1, HDMI, and 3.5mm audio. Intel says its NUC can support up to six displays simultaneously. It’s worth noting that this is primarily aimed at enthusiasts: you’ll have to add your own storage, RAM, and other peripherals before you can start playing with it. If the price is right (Intel hasn’t revealed details yet), this could usher in a new era of gaming PCs starting later this year.

This article was written by Abhimanyu Ghoshal from The Next Web and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Big Data: Amazon, Google, Microsoft, the Cloud…and other 2018 trends

In his latest book, “Thank You for Being Late,” Thomas Friedman highlighted 2007 as one of the most pivotal years in technology: 2007 saw the birth of Hadoop, the iPhone and Amazon’s Kindle, for instance. My bet is that 2018 will start a new era for the technology space, very much like 2007 did. I see two reasons for this: the cloud will change the game because it has become a measurable strategy for Microsoft, Google and Amazon. And Big Data will be less about technology; it will be more about management practices and processes. Data engineers will be forced to rethink their world as the enterprise mindset for analytics excellence shifts.

There is a catch embedded in each of these predictions, however. Let me try to convince you of my beliefs…and let me know if I don’t…

The Cloud Fallacy

You’ve heard it before: the Cloud is huge. Microsoft, Amazon and Google have built multi-billion-dollar businesses because enterprise CIOs are betting big on it. Just a few years ago, enterprises confessed they were ‘dabbling’ in it. This year, they are ‘all in’. I was recently invited to attend a banking technology conference where executives of a large multinational bank spelled out the change when they told me: “The Cloud and nothing else”.

I wrote last year about how the Cloud model was disrupting the economics and the deployment requirements of on-premises vendors.   I explained that many data management vendors would struggle to adapt to the Cloud and support their prospects’ evolving needs.  I called it the “Cloud fallacy”.

This year, I want to bring your attention to the risk of the “Cloud Lock-In”. We’ve seen this movie before: in the old enterprise data management days, vendors that had started out selling database products progressively moved up the stack, acquired or built applications to keep people engaged with their data warehouse and ultimately ‘locked’ customers into their suites. While CIOs liked the fact that they had one vendor to go to in case of issues (the “one throat to choke” theory), they also suffered from the fact that they had only one vendor to go to for licensing. What was convenient at first created less flexibility and leverage for CIOs.

If you missed this “lock-in” trend, look at Oracle’s acquisitions over the past decade and you’ll see what I mean…

Why this matters: as you start embarking on the “all cloud” journey, beware of the “lock-in”. Not every cloud is created equal. Amazon’s cloud has different options than that of Microsoft or Google. Each integrates differently with the rest of your environment. Most integrate best with their own stack. You will want to architect your technical environment in a flexible manner. Learn from the last two decades: you will likely want to switch from one vendor to the next over time, so make sure that the work you do for one cloud is portable to the others. One of our customers moved from a traditional data warehouse to a large on-premises Hadoop cluster, then to the Google Cloud in less than nine months. That’s less time than it used to take to build a data warehouse in 2008. So, in 2018 and onwards, educate yourself around the concept of semantic layers and don’t get locked in!
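The portability argument can be made concrete with a thin abstraction layer. The sketch below is a hypothetical illustration (the `ObjectStore` interface and `InMemoryStore` backend are invented for this example, not any vendor’s API): application code depends only on the interface, so switching clouds means writing one new adapter class rather than rewriting the workload.

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Thin abstraction so business logic never imports a vendor SDK directly."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in backend; a real project would wrap a cloud storage SDK instead."""

    def __init__(self):
        self._blobs = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Application code depends only on the interface, so swapping cloud
    # providers means adding one adapter class, not touching this logic.
    store.put(f"reports/{name}", body)


store = InMemoryStore()
archive_report(store, "q1.csv", b"revenue,region\n")
print(store.get("reports/q1.csv"))
```

The same idea underpins a semantic layer: queries and pipelines are written against a stable logical model, and only the adapter beneath it changes when the platform does.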

Big Data, Big Schmdata

In 2008, hip-hop group The Black Eyed Peas came up with catchy tunes that urged their fans to get ahead of their times. One of the lyrics that stuck with me was Fergie’s claim: “don’t be 2000 and late”.

When Big Data became big in 2008, enterprises started to hire data scientists and data engineers. In 2009, Hal Varian, chief economist at Google, claimed the sexy job in the next 10 years would be statistician. Enterprises, for fear of being “2000 and late,” onboarded scientists galore and instructed them to “start coding”. Some of the applications that came out of that work were cool. But many CIOs were left with poor results: hiring tens, hundreds or sometimes even thousands of scientists to support the ever-accelerating data needs of enterprise employees simply didn’t scale.

Industry analysts such as Gartner or Forrester will tell you that machine learning and artificial intelligence will come to save the day. That’s all well and good. But when I meet with some of my customers’ most successful chief data officers, the first thing they talk to me about is NOT technology. It’s organization structure and the mindset change required to win. Forward-thinking CDOs talk about their journey to “Insights-as-a-service” and how they are building the “Amazon of Big Data Analytics” inside their corporate walls.

Why this matters: I learned about the concept of the “Amazon of Big Data Analytics” from Joseph DeSantos, VP Data Analytics at TD Bank, and one of our customers. The way he articulated his vision for the center of analytics excellence was quite contrarian. He explained that, in the past, the enterprise I.T. group’s raison d’être was to “full-serve” the business. This meant that I.T. hired staff to lay out data infrastructure, build datamarts, and deliver fully baked reports and dashboards to business people, based on their requirements. The “self-service” data analytics era saw business users take control of the construction of reports, and sometimes even the building of datamarts and the data extraction and transformation tasks that were traditionally owned by I.T. This threw the enterprise for a loop. While CIOs liked to enable the business, they quickly realized that enabling the business with tasks they weren’t fully trained to do and data they weren’t technically entitled to see meant all hell broke loose. If you think about I.T. as the owner of a restaurant, “self-service” was the equivalent of inviting guests into the kitchen and letting them use professional-grade appliances…

What CDOs have now realized is that they need a better way to enable the business and protect the enterprise. They want to hire people who think of themselves as ‘educators’ of the business. They want to build systems, processes and organizational structures that let employees gain agility and freedom, while the firm keeps control over its governance framework. In “restaurant speak” this would be the equivalent of a “salad bar”, I suppose.

So, while you’ll hear a lot about Machine Learning, Artificial Intelligence and other cool trends, I suggest you keep an eye on IaaS (Insights-as-a-Service). According to Forrester, that market will double, with 80% of firms relying on insights service providers for some portion of insights capabilities in 2018. All technologies are welcome, but they need to be deployed as part of the right people-structure and they need to be applied to the right use case…

This article was written by Bruno Aziza from Forbes and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Don’t Let Your Team Fall Into These 3 Goal Setting Traps

There’s nothing worse than working on a goal that isn’t yours. In fact, doing so is akin to working toward somebody else’s fitness goal of losing weight, when what you really want to do is build muscle. Yet that’s exactly what happens more often than not in many teams I encounter today. Here are three conditions that I see teams face with regard to goal setting:

Condition #1: The goal belongs to somebody else.

This is when the goal is the brainchild of one person—or a select few—who ends up doing most of the work. This happens for myriad reasons, but the most common reason I see is a simple lack of trust within the team. There’s little trust between team members to accomplish the goal, which leads to greater micromanagement by the “brainchild” and stifles trust bonds even further. The bottom line: the goal is challenging and inspiring enough for members to want to pursue it, but it’s conceptualized by somebody else and that person doesn’t want to share. This isn’t teamwork, it’s a small group project.

Condition #2: The goal is given to you.

This is when a leader or manager tells you what your team’s goals should be without any situational understanding. Or, even worse, you’re told to come up with a marketing plan yet work in sales (this is an exaggeration obviously but the point is that a goal is given to you by somebody who lacks context). Think of goals that are given to you as the office parties that fall into the “mandatory fun” category. You’ll achieve them, but you won’t overachieve. This isn’t how a team sets goals, it’s how you allocate tasks on a to-do list.

Condition #3: The goal is a test.

This is when the goal already exists and the exercise of “goal setting” is really just a smokescreen to gain buy-in from members. These are goals that a supervisor or manager knows will have to be achieved but is unsure how they’ll be received within the team, so they sugarcoat them by giving the team more influence in shaping them than it actually has. This isn’t goal setting, it’s manipulation.

If team goal setting is more burdensome than beneficial then it’s time to reevaluate if you’re a real team or a group of individuals occupying the same space. Setting goals as a team should be fun, informative and produce real results. They should be just out of the team’s comfort zone yet not overwhelming. Most important, they should be a reason to celebrate if achieved (and who doesn’t like a reason to celebrate?).

 

This article was written by Jeff Boss from Forbes and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Netatmo’s Facebook chatbot lets you interact with your smart home

Netatmo has a new way to communicate with smart home devices: a Facebook chatbot that will take your orders and answer your questions.

The Smart Home Bot sits between you, the user, and the collection of Netatmo and compatible devices in your home. The questions it can answer and the orders it will follow depend on the devices you have.

Netatmo says you can ask questions like, “What is the current weather like?” or “Who’s at home?”

The former ties into the company’s weather station, while the latter ties into the face detection system in Netatmo’s home security cameras.

Instructions can also be issued, such as, “Turn on the lights” and “Set the bedroom temperature to 23°C,” the company said.

You’ll need Netatmo’s smart thermostat or radiator valves for the latter command, and those aren’t sold in the U.S.

So some of the new chatbot’s usefulness won’t be seen in the U.S., but it could still be a handy way to interact with your smart home, especially if you use Facebook Messenger a lot.

To get started, just fire up Facebook Messenger and start chatting with “Netatmo Smart Home Bot.” The service is free and a beta of the English-language version is already running. Other languages are promised to follow.
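At its core, a chatbot like this maps free-text utterances to device commands. The following is a minimal, hypothetical sketch of that pattern (the regexes and command strings are invented for illustration; this is not Netatmo’s implementation):

```python
import re

# Ordered list of (pattern, action) pairs: the first pattern that matches
# the message wins, and its action builds a device-command string.
INTENTS = [
    (re.compile(r"turn on the (\w+)", re.I),
     lambda m: f"power_on:{m.group(1).lower()}"),
    (re.compile(r"set the (\w+) temperature to (\d+)", re.I),
     lambda m: f"set_temp:{m.group(1).lower()}:{m.group(2)}"),
    (re.compile(r"who'?s at home", re.I),
     lambda m: "query_presence"),
]


def handle(message: str) -> str:
    """Return the command for the first matching intent, or a fallback."""
    for pattern, action in INTENTS:
        match = pattern.search(message)
        if match:
            return action(match)
    return "unknown_intent"


print(handle("Turn on the lights"))                 # power_on:lights
print(handle("Set the bedroom temperature to 23"))  # set_temp:bedroom:23
```

A production bot would sit behind a Messenger webhook, use a trained language model rather than regexes, and dispatch the resulting command to the device’s cloud API, but the overall shape (utterance in, device command out) is the same.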

This article was written by Martyn Williams from TechHive and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Going digital: Six step process for business

There are different strategies appropriate for different businesses undergoing digital transformation. A review by IT Portal outlines six steps that businesses can consider for the transformative process.

The six steps have been presented by IT analyst Matthew Tharp, drawing on experiences of General Electric and Tokio Marine. Central to many of the processes described is the relationship between the overall business and the Information Technology department. This reinforces the paradigm that digital transformation is not the responsibility of one department; like quality assurance, it needs to be the responsibility of all.

Other road maps for digital transformation, pulling out equivalent themes, are laid out by the technology analysts E-consultancy and CIO. The six steps called out, and discussed below, are:

Step 1: Map out the critical business processes

Mapping out the process is essential for constructing a digital narrative, giving the foundation to cope with rapid change and changing customer expectations. Before a business becomes digital or goes through major change, it is important that business processes are identified and documented. The process has an added benefit in that it allows different departments to interact and the importance of the ‘go digital’ message to be understood.

Step 2: Change management

Many organizations are ill-prepared for digital strategies. This is because change management has not been executed properly, with risks assessed and all parties on board and clear about their roles and responsibilities. To accelerate digital transformation through change management there are four pillars: Implementation, Adoption, Alignment and Change.


Step 3: Out-of-the-box capabilities

Businesses seeking improved agility in terms of speeding up the digital transformation process should avoid software that needs to be specially developed and instead use technology that can be implemented rapidly, such as platforms with out-of-the-box features and capabilities.

Step 4: Using AI

The use of artificial intelligence can assist with data-backed decision making and allows for more personalized customer experiences. Such systems can also give insights into bottlenecks and navigate more successful paths through a process. In addition, machine learning can be used to measure business results faster with real-time, predictive analytics.

Step 5: Unified technology

Ensuring that each area of the business is using the same or equivalent (compatible) technology makes the implementation process smoother and faster.


Step 6: Use of low-code technology

Many technology platforms use “low-code” methodology, which enables application development and automation without the need to do programming. This makes providing new services faster and reduces the demand on specialist services.

Although each path to digital transformation differs, the six steps provided by Tharp can be reviewed by businesses as a potential road map for adapting and developing.


The views expressed in content distributed by Newstex and its re-distributors (collectively, “Newstex Authoritative Content”) are solely those of the respective author(s) and not necessarily the views of Newstex et al. It is provided as general information only on an “AS IS” basis, without warranties and conferring no rights, which should not be relied upon as professional advice. Newstex et al. make no claims, promises or guarantees regarding its accuracy or completeness, nor as to the quality of the opinions and commentary contained therein.

This article was from Digital Journal and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.
