What’s the Business Value of AI? A Systems Engineer’s Take

Across four decades, I have worked as a systems engineer in the information technology (IT) industry, designing, architecting, and configuring computing systems and representing them to buyers and operations teams.

I’ve learned to see systems engineering as the art of designing IT solutions that amplify human productivity, capability, and creativity. For these aspirations to be realized, however, these solutions need to be reframed and translated into business value for acquisition and implementation.

It’s a tricky proposition in this hypercompetitive world, and one we’re seeing unfold in front of our eyes amid the current buzz around AI and large language models (LLMs). The ‘arrival’ of AI onto the scene is really the delivery on the promise and aspirations of six decades of iterative effort.

However, its success – defined in terms of business value – is not a given. To understand why, let me first take you back to a technical article I came across early in my career. “All machines are amplifiers,” it stated, simply and directly. That statement was an epiphany for me. I’d considered amplifiers to be just a unit in a stereo system stack, or what you plugged your guitar into.

Mind blown.

As I have pondered this realization across my career, I have come to consider IT as a collection of machines offering similar amplification, albeit on a much broader scale and with greater reach.

IT amplifies human productivity, capability, and creativity. It allows us to do things we could never do before and do them better and faster. It helps us solve complex problems and create new opportunities – for business and humanity.

To augment or to replace – THAT was the question

However, amplification is not an end in itself. In the 1960s, two government-funded research labs on opposite sides of the Stanford University campus pursued fundamentally different philosophies. One believed that powerful computing machines could substantially increase the power of the human mind. The other wanted to create a simulated human intelligence.

These efforts are documented in John Markoff’s book, “What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry”.

One group worked to augment the human mind, the other to replace it. Whilst these two purposes, or models, are still relevant to computing today, augmenting the human mind proved to be the easier of the two to deliver – with a series of miniaturization steps culminating in the general consumer availability of the personal computer (PC) in the 1980s. PCs freed humans to be individually productive and creative, and changed how education and business were done around the globe. Humanity rocketed forward and has not looked back since.

Artificial Intelligence (AI) is now becoming commercially viable and available at our fingertips to replace the human mind. It is maturing rapidly, being implemented at breakneck speeds in multiple domains, and will revolutionize how computing is designed and deployed in every aspect from this point forward. While it came to fruition later than its 1960s sibling, its impact will be no less revolutionary with, perhaps, an end-state of intelligence that can operate itself.

Meanwhile, automation on the augmentation front has also rapidly advanced, enabling higher productivity and efficiencies for humans. It’s still a human world, but our cycles continue to be freed up for whatever purpose we can imagine or aspire to, be they business or personal endeavors.

Systems engineering – finding a path between trade-offs

From a high-level fundamental compute standpoint, that’s all there really is – augment or replace. Both models must be the starting point of any system we design. To deliver on the goal, we turn to systems engineering and design at a more detailed, complex, and nuanced level. 

The primary task has always been simple in concept: move bits (bytes) of data into the processor’s registers, where they can be operated on. That is, get data as close to the processor as possible and keep it there for as long as practical.

In practice this can be a surprisingly difficult and expensive proposition, with a plethora of trade-offs. There are always trade-offs in IT. You can’t have it all. Even if having it all were technically feasible, in almost every case you couldn’t afford it, or certainly wouldn’t want to pay for it.

To accommodate this dilemma, at the lower levels of the stack we’ve created a hierarchy of storage and communication tiers designed to feed our processors in as efficient and effective a manner as practical, enabling them to do the ‘work’ we request of them.
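To make that hierarchy concrete, here is a minimal sketch (in Python; the data size and any resulting timings are illustrative assumptions, not figures from this article) that does the same work twice over the same in-memory data, once in storage order and once in random order. The arithmetic is identical; the only thing that changes is how far the next piece of data is from the processor when it is needed.

```python
import random
import time

N = 5_000_000
data = list(range(N))            # a large in-memory dataset (stand-in for any working set)

ordered = list(range(N))         # visit elements in storage order
shuffled = list(range(N))
random.shuffle(shuffled)         # visit the same elements in a random order

def scan(indices):
    """Sum every element of `data`, visiting them in the order given by `indices`."""
    start = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]
    return total, time.perf_counter() - start

_, t_sequential = scan(ordered)  # cache-friendly: the next item is usually already close by
_, t_random = scan(shuffled)     # cache-hostile: each access may have to reach much farther

print(f"sequential scan: {t_sequential:.2f}s   random scan: {t_random:.2f}s")
```

The slower of the two runs is paying for data that sits farther down the hierarchy at the moment it is requested, which is exactly the cost the chain of storage tiers exists to hide.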

For me, then, designing and engineering for purpose and fit is, in essence, simple. Firstly, am I solving for augmentation or replacement? Secondly, where’s the data, and how can I get it where it needs to be processed, governed, managed, and curated effectively? 

And one does not simply store, retrieve, manage, protect, move, or curate data. It explodes in volume, variety, and velocity, as we are wont to say in this industry, and those quantities are growing exponentially. Nor can we prune or curate it effectively, if at all, even if we wanted to.

Applying principles to the business value of AI

All of which brings us back to AI’s arrival on the scene. The potential for AI is huge, as we are seeing. From the systems engineer’s perspective, however, AI requires a complete data set to deliver the expected richness and depth of response. If the data set is incomplete then, ipso facto, so is the response – and thus it could be viewed as bordering on useless in many instances. In addition, AI algorithms can be exhaustive (and processor-intensive) or can take advantage of trade-offs.
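As a hypothetical illustration of that last point (the data set, dimensions, and sampling rate below are invented for the sketch, not drawn from the article), compare an exhaustive nearest-neighbor lookup over a complete data set with a cheaper pass over a small sample: the approximate version trades answer quality for a large reduction in processing.

```python
import math
import random

random.seed(0)

# A toy "complete data set": 100,000 points in 3-D (a stand-in for embeddings,
# sensor readings, customer records, and so on).
dataset = [(random.random(), random.random(), random.random()) for _ in range(100_000)]
query = (0.5, 0.5, 0.5)

def nearest(points, q):
    """Return the point in `points` closest to `q` by Euclidean distance."""
    return min(points, key=lambda p: math.dist(p, q))

# Exhaustive: scan every record. Full fidelity, full processing cost.
exact = nearest(dataset, query)

# Trade-off: scan a 1% random sample. Roughly 100x less work, approximate answer.
sample = random.sample(dataset, k=len(dataset) // 100)
approx = nearest(sample, query)

print("exhaustive answer distance :", round(math.dist(exact, query), 4))
print("sampled answer distance    :", round(math.dist(approx, query), 4))
```

The exhaustive scan touches every record and pays the full compute bill; the sampled scan is far cheaper but can only be as good as the subset it saw, which mirrors the incomplete data set problem described above.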

This opens up a target-rich environment of problems for clever computer scientists and systems engineers to solve, and therein lie the possibilities, trade-offs, and associated costs that drive every decision to be made and problem to be solved at every level of the architecture: user, application, algorithm, data, or infrastructure and communications.

AI has certainly ‘arrived’, although for the systems engineer, it’s more a continuation of a theme, or evolution, than something completely new. As the PC in the 1980s was the inflection point for the revolution of the augmentation case, so too is AI in the 2020s for the replacement case. 

It then follows: how are we to leverage AI effectively? We will need the right resources and capabilities in place (people, skills, tools, technology, money, et al.) and the ability within the business to use the outputs it generates. It resolves to business maturity, operational models, and transformation strategies.

Right now, I see three things lacking. First, from the provider perspective, AI platforms (and the related data management) are still limited, which means a substantial amount of DIY to get value out of them. I’m not talking about ChatGPT in itself but, for example, about how it integrates with other systems and data sets. Do you have the knowledge you need to bring AI into your architecture?

Second, operational models are not geared up to do AI with ease. AI doesn’t work out of the box beyond off-the-shelf models, however powerful they are. Data scientists, model engineers, data engineers, programmers, and operations staff all need to be in place and skilled up. Have you reviewed your resourcing and maturity levels?

Finally, and most importantly, is your organization geared up to benefit from AI? Suppose you learn a fantastic insight about your customers (such as the example of vegetarians being more likely to arrive at their flights on time), or you find out when and how your machinery will fail. Are you able to react accordingly as a business?

If the answer to any of these questions is lacking, then you can see an immediate source of inertia that will undermine business value or prevent it altogether. 

In thinking about AI, perhaps don’t think about AI… think about your organization’s ability to change and unlock AI’s value to your business.


Case Study: Ingram Micro

“GigaOm and Ingram Micro work with partners to drive strategic growth and deliver more value to technology buyers. GigaOm goes beyond technology to enable partners to better connect technology solutions with customers’ business operating models, people and process organizational maturity, and transformational aspirations.

“Before GigaOm, partners were challenged to access this level of go-to-market positioning and messaging, strategic roadmap advice, and sales enablement. Ingram Micro and partners have benefited tremendously from the great strategic advice, research, and sales enablement produced by GigaOm.”

–Karl Connolly, technologist and field CTO at Ingram Micro.

Context setting

Ingram Micro is one of the world’s largest distributors of IT systems and services, with operations in 61 countries and reaching nearly 90% of the world’s population. Ingram Micro works with 1,500 original equipment manufacturer (OEM) vendor partners and 170,000 technology solution provider customers across the cloud, AI, data, infrastructure, security, storage, and networking. Ingram Micro’s mission is to enable technology channel partners to accelerate growth, run better and more profitably, and deliver more value for their customers—business and technology decision-makers.

Ingram Micro turned to GigaOm to enable its reseller and OEM partners to drive strategic growth by connecting their technology solutions to the strategic operational business models, organizational maturity, and transformation goals of customer organizations.

Karl Connolly is technologist and field CTO at Ingram Micro. He says the relationship with GigaOm provides unique value to Ingram Micro’s channel-focused business.

“GigaOm understands customers’ motivations so they can help partners better connect their technology solutions to how customers run their business,” says Connolly. “Ingram Micro is a big brand that is recognized, but is one step removed from the end customer by nature of its channel-focused business model. Its capabilities are not necessarily understood or known to end customers and resellers—gaining the voice and trust of end customers always takes place via an Ingram Micro reseller.”

Figure 1. Karl Connolly, Technologist and Field CTO, Ingram Micro

The goal was to help end-user organizations make better-informed technology choices to enable their businesses, explains Connolly. “The list of choices of technology solutions available to a customer can be daunting, and with each vendor competing for said customer, informed decisions can be clouded by misrepresentation, ambiguity, and partner preference.”

Ingram Micro understood that getting a clearer picture of a customer’s business and operational model requires understanding the organization’s people and process maturity and its transformational aspirations. A better understanding of how to engage with end-user organizations would positively impact Ingram Micro, vendor, and partner sales revenue.

Why GigaOm?

The company has worked with analyst firms before, including IDC, Gartner, and Forrester. However, as Ingram Micro developed its go-to-market approaches and channel sales strategies, it saw GigaOm’s business and technical practitioner-led approach, covering C-level leaders, architects, and engineers, as a major asset.

“GigaOm’s advisors are experienced strategists, practitioners, and engineers who have been IT buyers and consumers,” explains Connolly. “This unique perspective provides us with the ‘voice of the customer,’ allowing us to connect technology solutions to C-level, line of business, and architect business value-centric strategy based on customer’s operational business models, organizational maturity, and transformation goals, which has proven instrumental in shaping our go-to-market strategy.”

GigaOm’s unique position as a C-level, line of business, architect, and engineer practitioner-led advisory, research, and enablement company, along with its voice-of-the-customer understanding, enables it to present technology in a way end-user organizations can connect with.

“GigaOm helps us and our partners make the case to end users for adoption of a technology area based on brand or specific product by being the voice of the customer,” says Connolly. “The research and insights enable informed decisions based on experience, testing, and unbiased assessment from the end user perspective. We are afforded opinions, perspectives, and facts that aren’t attainable from other firms, or from end users directly.”

GigaOm sees research as a tool to enable stakeholders on all sides: Vendors and partners understand how to talk to customers better, and end-users become better equipped to decide between complex offerings. It was this flexible approach that brought Ingram Micro and GigaOm together.

Figure 2. GigaOm and Ingram Micro Engagement Model

Aspects of GigaOm’s offering align with Ingram Micro’s vision, strategy, and approach, not least GigaOm’s brand value and partnership approach. “GigaOm has built a good solid brand and has credibility, and the DNA of the company fits with ours on channel partners,” says Connolly.

“We particularly appreciate GigaOm’s strong connections with many of the OEMs supported by Ingram Micro. This synergy has further enhanced the value of their insights for our business,” says Connolly. “We and our partners can also generate revenue with GigaOm by reselling GigaOm research and services to end-user customers; that is a unique capability.”

Solution and Approach

The partnership between Ingram Micro and GigaOm has been directly targeted at relationship building and enabling the company to engage in more strategic conversations about technology solutions. To kick things off, GigaOm CTO Howard Holton presented to Ingram Micro solution architects, covering strategic areas such as the shift from CAPEX to OPEX.

These customer-led perspectives helped solutions teams better understand how to connect with OEM vendors based on a balanced perspective of their offerings. “Having an unbiased expert in Howard is invaluable,” says Connolly. “Often, teams are informed by the vendor, which can be limiting.”

As a result of the engagement, the Ingram Micro team gained a firmer foundation for discussing solutions with vendors and solution providers, enabling them to better drive customer conversations.

In addition, GigaOm participated in several conversations across multiple service providers and other partner firms, including T-Mobile, Betacom, and Otava. The goal was to develop effective business and technical enablement and sales strategies with customers based on their industry vertical, with positioning, messaging, benchmarking, market evaluation, and cost analysis services spanning C-level, line-of-business, architect, and engineer audiences. “GigaOm’s openness and willingness to build custom engagements for the partner was very well received, with the right set of assets delivered,” says Connolly.

Benefits

Overall, GigaOm’s engagement with Ingram Micro enabled the company to hone its strategies and sales plays based on the language of the customer. “GigaOm’s service of advisory and validation is a step above what its peers provide. The value of the advisory, coming from the vantage point of one who has done it, is more compelling than that of one who has studied or read up on a subject,” says Connolly.

Not only this, but the interaction helped Ingram Micro identify new opportunities within its partner portfolio. A specific example is Betacom: “GigaOm was the reason Ingram Micro became aware of Betacom, which is becoming a strategic partner for Ingram Micro in the 5G and industrial manufacturing space,” says Connolly. Manufacturing and operational technology (OT) is undergoing rapid digital transformation and is a relatively new industry vertical for Ingram Micro. An unexpected benefit came from GigaOm’s Holton, who had previously worked at and consulted for major industrial companies.

“Our advanced solutions team leveraged Howard’s operational business insight and strategic expertise to gain understanding of the manufacturing and OT buyer across real world insights, considerations, and buying process advice. This helped the team get informed and prepared for partner meetings and our MxD partnership.”

GigaOm’s research and insights have directly impacted growth for Ingram Micro and its partners, says Connolly. “GigaOm services enable driving increased share of wallet by promoting more of what Ingram Micro offers to its partners in product, services, and solutions.”

Connolly says this direct benefit emanates from several factors:

Thought leadership: “An advisory company like GigaOm that has a consumer community consisting of tech and business influencers is a good way to be seen as thought leaders. GigaOm can potentially help buyers understand their options as it pertains to a new concept, program, or technology to support their operating model and transformation goals.”

Credibility & people cost savings: “GigaOm can help less mature or cost-conscious partners gain consulting capabilities without the need to staff CIOs, field CTOs, architects, and engineers in-house, which could cost upward of $1 million, instantly providing credibility across a broad domain of markets, services, and solutions.”

Independence: “GigaOm can act as a symbiotic extension to a partner, as it has no desire or aspiration to become a VAR and is purely there to help the partner uncover more opportunity.”

Opportunity: “GigaOm insights enable partners and Ingram Micro to stay current on macro themes and the OEMs filling those spaces, which has tangential and heretofore unmeasured value.”

Content: “Candid feedback on some of the materials and presentations we gave has been helpful in shaping how future content can be crafted and delivered to better connect with customer’s business operating model, organizational maturity and transformation goals.”

Next Moves

Ingram Micro and GigaOm will continue to build on the current success with individual partners and customers, and Ingram Micro will use the GigaOm partnership to further its position with service providers. “GigaOm will aid our partners to better connect with customers, gain visibility and mindshare in the market, specifically as we engage with our MNO and private network provider partners who can use enablement research and advisory services from GigaOm to become better known as leaders in a specific area, such as private 5G or connected workers,” says Connolly.

And what about Ingram Micro? “Beyond additional sales opportunities and new partnerships, our relationship with GigaOm can inform our portfolio and the solutions we offer our partners, elevating us beyond the traditional role that distribution plays.”


Meeting Owl videoconference device used by govs is a security disaster

The Meeting Owl Pro is a videoconference device with an array of cameras and microphones that captures 360-degree video and audio and automatically focuses on whoever is speaking to make meetings more dynamic and inclusive. The consoles, which are slightly taller than an Amazon Alexa and bear the likeness of a tree owl, are widely used by state and local governments, colleges, and law firms.

A recently published security analysis has concluded the devices pose an unacceptable risk to the networks they connect to and the personal information of those who register and administer them. The litany of weaknesses includes:

  • The exposure of names, email addresses, IP addresses, and geographic locations of all Meeting Owl Pro users in an online database that can be accessed by anyone with knowledge of how the system works. This data can be exploited to map network topologies or socially engineer or dox employees.
  • The device exposes to anyone with access to it the interprocess communication (IPC) channel it uses to interact with other devices on the network. This information can be exploited by malicious insiders or by hackers who exploit some of the vulnerabilities found during the analysis.
  • Bluetooth functionality designed to extend the range of the devices and provide remote control uses no passcode by default, making it possible for a hacker in proximity to control the devices. Even when a passcode is optionally set, the hacker can disable it without first having to supply it.
  • An access point mode that creates a new Wi-Fi SSID while using a separate SSID to stay connected to the organization network. By exploiting Wi-Fi or Bluetooth functionalities, an attacker can compromise the Meeting Owl Pro device and then use it as a rogue access point that infiltrates or exfiltrates data or malware into or out of the network.
  • Images of captured whiteboard sessions—which are supposed to be available only to meeting participants—could be downloaded by anyone with an understanding of how the system works.

Glaring vulnerabilities remain unpatched

Researchers from modzero, a Switzerland- and Germany-based security consultancy that performs penetration testing, reverse engineering, source-code analysis, and risk assessment for its clients, discovered the threats while conducting an analysis of videoconferencing solutions on behalf of an unnamed customer. The firm first contacted Meeting Owl maker Owl Labs of Somerville, Massachusetts, in mid-January to privately report its findings. As of the time this post went live on Ars, none of the most glaring vulnerabilities had been fixed, leaving thousands of customer networks at risk.


No more dealer markups: Ford wants to move to online-only sales for EVs

Ford’s electric F-150 Lightning (L), eTransit (M), and Mustang Mach-E (R) battery-electric vehicles have all been such successes that they’re all sold out for the rest of the year. And that’s prompting the company to rethink how it goes about the whole process. (credit: Ford)

Few Americans enjoyed the car-buying process even before supply chain chaos and the chip shortage led dealerships to mark up inventory by thousands of dollars. But buying a Ford electric vehicle might be a lot less painful in the future, if Ford CEO Jim Farley gets his way. On Wednesday, Farley said that he wants the company’s EVs to be sold online-only, with no dealer markups or other price negotiations, according to the Detroit Free Press.

“We’ve got to go to non-negotiated price. We’ve got to go to 100 percent online. There’s no inventory (at dealerships), it goes directly to the customer. And 100 percent remote pickup and delivery,” Farley said while speaking at a conference in New York.

One of Tesla’s most popular innovations was to eschew traditional dealerships and sell its products directly to customers. But traditional manufacturers like Ford are usually prohibited from selling their products directly to customers, a legacy of fears over vertical integration written into state laws during the early 20th century. As such, Ford’s franchised dealers will almost certainly still have a role to play.


BioWare reveals Dreadwolf as the next Dragon Age title

Get busy imagining this logo on a box.

It has been nearly eight years since Dragon Age: Inquisition launched as the most recent full game in BioWare’s acclaimed RPG series and nearly four years since an unnamed sequel was first teased at the 2018 Game Awards. On Thursday, developer BioWare revealed an official title for that sequel—Dragon Age: Dreadwolf—and confirmed the game won’t be coming until 2023 at the earliest.

In a brief blog post, BioWare confirmed the new game will focus on antagonist Solas, the mysterious elven hedge mage who was introduced as the Dread Wolf in Inquisition. Solas was also central to that game’s 2015 Trespasser DLC and featured heavily in a four-minute Gamescom 2020 behind-the-scenes featurette on the game.

In its announcement, BioWare describes Solas as someone whose “motives are inscrutable and his methods sometimes questionable, earning him a reputation as something of a trickster deity—a player of dark and dangerous games.” The developer also insists that “if you’re new to Dragon Age, you have no need to worry about not having met our antagonist just yet. He’ll properly introduce himself when the time is right.”


Find the soul