What’s the Score?

Why have we been making changes to the GigaOm Key Criteria and Radar reports?

We are committed to a rigorous, defensible, consistent, and coherent framework for assessing and evaluating enterprise technology solutions and vendors. The scoring and framework changes we’ve made serve this effort: they make our assessments verifiable, ground them in agreed concepts, and ensure that scoring is articulated, inspectable, and repeatable.

This adjustment is designed to make our evaluations more consistent and coherent, which makes it easier for vendors to participate in research and results in clearer reports for end-user subscribers.

What are the key changes to scoring?

The biggest change is to the feature and criteria scoring in the tables of GigaOm Radar reports. Scoring elements are still weighted, as they have been in the past, but now in a more consistent and standardized fashion across reports. The goal is to focus our assessment scope on the specific key features, emerging features, and business criteria identified as decision drivers by our analysts.

Scoring of these features and criteria determines the plotted distance from the center for vendors in the Radar chart. We are extending our scoring range from a four-point system (0, 1, 2, or 3) to a six-point system (0 through 5). This enables us to distinguish truly exceptional products from those that are merely very good, affords us greater nuance in scoring, and better informs the positioning of vendors on the Radar chart.

Determining vendor position along the arc of the Radar chart has been refined as well. Previously, analysts were asked to determine where they believed solutions should be positioned on the Radar chart—first, whether they should occupy the upper (Maturity) or lower (Innovation) hemisphere, then their position left to right, from Feature Play to Platform Play. Just as we’ve extended our feature and criteria scoring, the scheme for determining quadrant position is now more granular and grounded. Analysts must consider each aspect individually—Innovation, Maturity, Feature Play, Platform Play—and score each vendor solution’s alignment accordingly.

We have now adapted how we plot solutions along the arc in our Radar charts, ensuring that the data we’re processing is relevant to the purchase decision within the context of our reports. Our scoring focuses primarily on key differentiating features and business criteria (non-functional requirements), then, to a lesser extent, on emerging features that we expect to shape the sector going forward.

For example, when you look at Feature Play and Platform Play, a feature-oriented solution is typically focused on going deeper, perhaps on functionality, or on specific use cases or market segments. However, this same solution could also have very strong platform aspects, addressing the full scope of the challenge. Rather than deciding one or the other, our system now asks you to provide an independent score for each.

Keep in mind, these aspects exist in pairs. Maturity and Innovation are one pair, and Feature and Platform Play the other. One constraint is that paired scores cannot be identical—one “side” must be higher than the other to determine a dominant score that dictates quadrant residence. The paired scores are then blended using a weighted scheme to reflect the relative balance (say, scores of 8 and 9) or imbalance (like scores of 7 and 2) of the feature and platform aspects. Strong balanced scores for both feature and platform aspects will yield plots that tend toward the y-axis, signifying an ideal balance between the aspects.
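To make the blending concrete, here is a minimal sketch of how a pair of aspect scores might map to a position along the arc. It assumes a 0-10 scale per aspect and a simple linear weighting; the actual GigaOm weighting scheme is not spelled out here, so treat this as an illustration of the principle rather than the production formula.

```python
# Hypothetical sketch of blending paired aspect scores into an arc position.
# Assumes a 0-10 scale per aspect and a simple linear blend; the actual
# GigaOm weighting scheme is not published here.

def arc_position(feature_play: float, platform_play: float) -> float:
    """Map a (Feature Play, Platform Play) score pair to an angle in degrees.

    0 degrees is a pure Feature Play, 180 a pure Platform Play, and 90
    (the y-axis) a perfect balance. Ties are forbidden, so a plot never
    lands exactly on the line.
    """
    if feature_play == platform_play:
        raise ValueError("Paired scores must be unique; ties are not allowed.")
    total = feature_play + platform_play
    # Balanced pairs (e.g., 8 and 9) land near the y-axis; lopsided pairs
    # (e.g., 7 and 2) land deep inside the dominant aspect's half.
    return 180.0 * platform_play / total

print(arc_position(8, 9))  # ~95.3: strong and balanced, close to the y-axis
print(arc_position(7, 2))  # 40.0: clearly a Feature Play
print(arc_position(6, 5))  # ~81.8: a near-tie, close to the middle
```

The same logic applies to the Maturity and Innovation pair for vertical placement.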

But you have to make a choice, right?

Yes, paired scores must be unique; the analysts must choose a winner. It’s tough, but in those situations, they will be giving scores like 6 and 5 or 8 and 7, which will typically land them close to the middle between the two aspects. You can’t have a tie, and you can’t be right on the line.

Is Platform Play better than Feature Play?

We talk about this misconception a lot! The word “platform” carries a lot of weight and is a loaded term. Many companies market their solutions as platforms even when they lack aspects we judge necessary for a platform. We actually considered using a term other than Platform Play but ultimately found that platform is the best expression of the aspects we are discussing. So, we’re sticking with it!

One way to get clarity around the Platform Play and Feature Play concepts is to think in terms of breadth and depth. A platform-focused offering will feature a breadth of functionality, use-case engagement, and customer base. A feature-focused offering, meanwhile, will provide depth in these same areas, drilling down on specific features, use cases, and customer profiles. This framing helps in reasoning through the characterization process. In our assessments, we ask, “Is a vendor deepening its offering on the feature side, or are there areas it intentionally doesn’t cover, relying instead on third-party integrations?” Ultimately, think of breadth and depth as subtitles for Platform Play and Feature Play.

The challenge is helping vendors understand the platform and feature concepts and how they are applied in scoring and evaluating products in GigaOm Radar reports. These are expressions not of quality but of character. Ultimately, quality is expressed by how far each plot is from the center—the closer you are to that bullseye, the better. The rest is about character.

Vendors will want to know: How can we get the best scores?

That’s super easy—participate! When you get an invite to be in our research, respond, fill out the questionnaire, and be thorough about it. Set up a briefing and, in that briefing, focus on informing the analyst rather than delivering a marketing spiel. Get your message in early. That will enable us to give your product the full attention and assessment it needs.

We cannot force people to the table, but companies that show up will have a leg up in this process. The analysts are informed, they become familiar with the product, and that gives you the best chance to do well in these reports. Our desk research process is robust, but it relies on the quality of your external marketing and whatever information we uncover in our research. That creates a potential risk that our analysis will miss elements of your product.

The other key aspect is the fact-check process. Respect it and try to stay in scope. We see companies inserting marketing language into assessments or trying to change the rules of what we are scoring against. Those things will draw focus away from your product. If issues need to be addressed, we’ll work together to resolve them. But try to stay in scope and, again, participate, as it’s your best opportunity to appeal before publication.

Any final thoughts or plans?

We’re undergirding our scoring with structured decision tools and checklists—for example, to help analysts determine where a solution fits on the Radar—that will further drive consistency across reports. It also means that when we update a report, we can assess against the same rubric and determine what changes are needed.

Note that we aim to update Key Criteria and Radar reports based on what has changed in the market. We’re not rewriting the report from scratch every year; we’d rather put our effort into evaluating changes in the market and with vendors and their solutions. As we take things forward, we will seek more opportunities for efficiency so we can focus our attention on where innovation comes from.


SIEM and SOAR – Will They or Won’t They?

A considerable percentage of SIEM vendors share a vision for how to help security operations centers (SOCs) deal with the high volume and complexity of security attacks. These vendors are integrating acquired SOAR solutions or natively developing SOAR capabilities to create a unified platform for security analysts.

A combined SIEM and SOAR solution will make up most of the SOC analyst’s daily toolset and reallocate their brainpower from conducting repetitive analysis and response tasks to investigating only the incidents of significant interest and importance.

The core of this offering therefore enables the SOC to address the biggest hindrance for analysts: volume. Instead of dealing with high-volume, low-complexity attacks, businesses can dedicate analysts to truly important attacks, such as unknown unknowns or zero-day attacks. 

We can define this combined toolset of SIEM and SOAR as “autonomous SOC solutions” because, with adequate configuration, the number of analysts is no longer the only way a business can scale up its security operations to deal with more threats. I’ve been covering this in the upcoming Key Criteria report.

I previously wrote about the positive outlook for standalone SOAR, so I want to preface this by saying that I’m not one to make broad predictions. My role as an industry analyst is to observe vendors’ strategic decisions and their responses to customer demands, and this is one such trend. We have a large sample size of SIEM vendors to draw on, having identified roughly 40 solutions. Of these, as many as 16 vendors have entered the autonomous SOC arena.

Between the 16 vendors we identified as delivering these autonomous SOC solutions and the remaining 20+ pure-play SIEM vendors, I can only classify this as a “will-they-won’t-they” situation on whether SIEM and SOAR will remain distinct or merge.

Quiet developments over noisy acquisitions

Security acquisitions make a lot of noise in the market, and SOAR acquisitions have been some of the loudest. Google acquired Siemplify, Devo acquired LogicHub, Fortinet acquired CyberSponse, Palo Alto Networks acquired Demisto, Splunk acquired Phantom and was acquired by Cisco, Sumo Logic acquired DFLabs, and Micro Focus acquired Atar Labs, which, in turn, was acquired by OpenText. 

With this in mind, most observers would expect the majority of vendors delivering autonomous SOC solutions to have acquired and integrated a SOAR solution. However, if we filter out SIEM vendors who’ve acquired SOAR solutions but have not integrated them into a unified solution—the likes of Google, IBM, Fortinet, and Splunk—we quickly find that the majority of vendors featured in the Radar report for autonomous SOC solutions have actually developed their solutions in-house.

So, in the acquire-then-integrate bucket, we have Devo, LogPoint, Sumo Logic, and OpenText.

In the developing-SOAR-in-house category, we have a threefold increase in the number of vendors, including Elastic, Exabeam, Hunters, Huntsman, LogRhythm, NetWitness, Palo Alto Networks, Securonix, Logsign, ManageEngine, Microsoft, and Rapid7.

Coexisting solutions

SIEM has so far stood the test of time, so organizations are unlikely to swap out their existing solutions unless there’s a compelling reason to do so. SIEM is also an important checkbox for many regulations. As my esteemed colleague Chris Ray points out, as long as security standards have the acronym SIEM on the requirements list, the solution and its name will remain a constant, regardless of how much it evolves from a technical point of view. SOAR also has a strong mandate to exist as a standalone solution, which we explore further here.

So, as much as we like to put things in boxes, the reality is that SIEM, SOAR, and solutions combining the two will coexist for the foreseeable future. The only force that will validate the autonomous SOC market is whether customers are willing to invest money in combined solutions, replacing or augmenting the individual parts of incumbent tools. 

However, it would be disadvantageous for vendors with a combined solution to position themselves in the same space as pure-play SIEM competitors. Simply reframing the combined offering as a “next-generation SIEM” doesn’t capture the extensive difference between a SIEM and an autonomous SOC solution. So, even if SIEM must remain on the compliance checklist, these vendors need to distinguish their solutions in a very crowded SIEM market.


What’s the Business Value of AI? A Systems Engineer’s Take

Across four decades, I have worked as a systems engineer in the information technology (IT) industry, designing, architecting, and configuring computing systems and representing them to buyers and operations teams.

I’ve learned to see systems engineering as the art of designing IT solutions that amplify human productivity, capability, and creativity. For these aspirations to be realized, however, these solutions need to be reframed and translated into business value for acquisition and implementation.

It’s a tricky proposition in this hypercompetitive world, as the current buzz around AI and large language models (LLMs) makes plain. The ‘arrival’ of AI onto the scene is really the delivery of the promise and aspirations of six decades of iterative effort.

However, its success – defined in terms of business value – is not a given. To understand this, let me first take you back to a technical article I came across early on in my career. “All machines are amplifiers,” it stated in a simple and direct manner. That statement was an epiphany for me. I’d considered amplifiers as just a unit in a stereo system stack or what you plugged your guitar into. 

Mind blown.

As I have pondered this realization across my career, I have come to consider IT as a collection of machines offering similar amplification, albeit on a much broader scale and with greater reach.

IT amplifies human productivity, capability, and creativity. It allows us to do things we could never do before and do them better and faster. It helps us solve complex problems and create new opportunities – for business and humanity.

To augment or to replace – THAT was the question

However, amplification is not an end in itself. In the 1960s, two government-funded research labs on opposite sides of the Stanford University campus pursued fundamentally different philosophies. One believed that powerful computing machines could substantially increase the power of the human mind. The other wanted to create a simulated human intelligence.

These efforts are documented in John Markoff’s book, “What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry.”

One group worked to augment the human mind, the other to replace it. Whilst these two purposes, or models, are still relevant to computing today, augmenting the human mind proved to be the easier of the two to deliver – with a series of miniaturization steps culminating in the general consumer availability of the personal computer (PC) in the 1980s. PCs freed humans to be individually productive and creative, and changed how education and business were done around the globe. Humanity rocketed forward and has not looked back since.

Artificial Intelligence (AI) is now becoming commercially viable and available at our fingertips to replace the human mind. It is maturing rapidly, being implemented at breakneck speeds in multiple domains, and will revolutionize how computing is designed and deployed in every aspect from this point forward. While it came to fruition later than its 1960s sibling, its impact will be no less revolutionary with, perhaps, an end-state of intelligence that can operate itself.

Meanwhile, automation on the augmentation front has also rapidly advanced, enabling higher productivity and efficiencies for humans. It’s still a human world, but our cycles continue to be freed up for whatever purpose we can imagine or aspire to, be they business or personal endeavors.

Systems engineering – finding a path between trade-offs

From a high-level, fundamental compute standpoint, that’s all there really is – augment or replace. One of these two models must be the starting point of any system we design. To deliver on the goal, we turn to systems engineering and design at a more detailed, complex, and nuanced level.

The primary task has always been simple in concept – to move bits (and bytes) of data into the processor registers, where they can be operated upon. That is, get data as close to the processor as possible and keep it there for as long as practical.

In practice, this can be a surprisingly difficult and expensive proposition with a plethora of trade-offs. There are always trade-offs in IT. You can’t have it all. Even if having it all were technically feasible and attainable, you couldn’t afford it – and in almost every case you wouldn’t want to.

To accommodate this dilemma, at the lower levels of the stack we’ve created a chain of storage and communication tiers designed to feed our processors in as efficient and effective a manner as practical, enabling them to do the ‘work’ we request of them.
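To put rough numbers on why that chain of tiers exists, here is a back-of-the-envelope sketch. The latencies are widely cited order-of-magnitude figures (the classic “latency numbers every programmer should know”), not measurements of any particular system; the point is how the cost of the same workload balloons the farther the data sits from the processor.

```python
# Order-of-magnitude access latencies for the storage/communication tiers
# the text describes. Rough, widely cited figures, not measured values.
ACCESS_TIME_NS = {
    "L1 cache": 1,
    "L2 cache": 4,
    "main memory (DRAM)": 100,
    "NVMe SSD": 100_000,
    "spinning disk": 10_000_000,
    "cross-region network": 150_000_000,
}

ACCESSES = 1_000_000  # a hypothetical workload making a million data accesses

for tier, ns in ACCESS_TIME_NS.items():
    seconds = ACCESSES * ns / 1e9
    print(f"{tier:>22}: {seconds:>14,.3f} s")
```

The same million accesses that take a millisecond from L1 cache take minutes from SSD and days over a cross-region network, which is why so much engineering effort goes into keeping data close.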

For me, then, designing and engineering for purpose and fit is, in essence, simple. Firstly, am I solving for augmentation or replacement? Secondly, where’s the data, and how can I get it where it needs to be processed, governed, managed, and curated effectively? 

And one does not simply store, retrieve, manage, protect, move, or curate data. It explodes in volume, variety, and velocity, as we are wont to say in this industry, and these quantities are growing exponentially. Nor can we prune or curate it effectively, if at all, even if we wanted to.

Applying principles to the business value of AI

All of which brings us back to AI’s arrival on the scene. The potential for AI is huge, as we are seeing. From the systems engineer’s perspective, however, AI requires a complete data set to enable the expected richness and depth of response. If the dataset is incomplete, ipso facto, so is the response – and, thus, it could be viewed as bordering on useless in many instances. In addition, AI algorithms can be exhaustive (and processor-intensive) or take advantage of trade-offs.
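As a toy illustration of that exhaustive-versus-trade-off point (and not of any specific AI algorithm), compare an exact nearest-neighbor scan with a cheap sampled approximation: the exhaustive pass examines every point and guarantees the right answer, while the trade-off examines 1% of the data and is usually close enough.

```python
# Toy contrast between an exhaustive algorithm and a sampling trade-off.
# Illustrative only; not a production nearest-neighbor implementation.
import random

random.seed(0)
points = [(random.random(), random.random()) for _ in range(100_000)]
query = (0.5, 0.5)

def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

# Exhaustive: scan all 100,000 points (processor-intensive, exact).
exact = min(points, key=lambda p: dist2(p, query))

# Trade-off: scan a 1% random sample (cheap, approximate).
approx = min(random.sample(points, 1_000), key=lambda p: dist2(p, query))

print("exact :", exact, dist2(exact, query))
print("approx:", approx, dist2(approx, query))
```

Which side of that trade-off to take is exactly the kind of decision the next paragraph describes.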

This opens up a target-rich environment of problems for clever computer scientists and systems engineers to solve, and therein lie the possibilities, trade-offs, and associated costs that drive all the decisions to be made and problems to be solved at every level of the architecture – user, application, algorithm, data, or infrastructure and communications.

AI has certainly ‘arrived’, although for the systems engineer, it’s more a continuation of a theme, or evolution, than something completely new. As the PC in the 1980s was the inflection point for the revolution of the augmentation case, so too is AI in the 2020s for the replacement case. 

It then follows: how are we to effectively leverage AI? We will need the right resources and capabilities in place (people, skills, tools, tech, money, et al.) and the ability within the business to use the outputs it generates. It comes down to business maturity, operational models, and transformation strategies.

Right now I see three things as lacking. First, from the provider perspective, AI platforms (and related data management) are still limited, which means a substantial amount of DIY to get value out of them. I’m not talking about ChatGPT in itself but, for example, how it integrates with other systems and data sets. Do you have the knowledge you need to bring AI into your architecture?

Second, operational models are not geared up to do AI with ease. AI doesn’t work out of the box beyond off-the-shelf models, however powerful they are. Data scientists, model engineers, data engineers, programmers, and operations staff all need to be in place and skilled up. Have you reviewed your resourcing and maturity levels?

Finally, and most importantly, is your organization geared up to benefit from AI? Suppose you learn a fantastic insight about your customers (such as the example of vegetarians being more likely to arrive at their flights on time), or you find out when and how your machinery will fail. Are you able to react accordingly as a business?

If the answer to any of these questions is lacking, then you can see an immediate source of inertia that will undermine business value or prevent it altogether. 

In thinking about AI, perhaps don’t think about AI… think about your organization’s ability to change and unlock AI’s value to your business.


Case Study: Ingram Micro

“GigaOm and Ingram Micro work with partners to drive strategic growth and deliver more value to technology buyers. GigaOm goes beyond technology to enable partners to better connect technology solutions with customers’ business operating models, people and process organizational maturity, and transformational aspirations.

“Before GigaOm, partners were challenged to access this level of go-to-market positioning and messaging, strategic roadmap advice, and sales enablement. Ingram Micro and partners have benefited tremendously from the great strategic advice, research, and sales enablement produced by GigaOm.”

–Karl Connolly, technologist and field CTO at Ingram Micro.

Context setting

Ingram Micro is one of the world’s largest distributors of IT systems and services, with operations in 61 countries and reaching nearly 90% of the world’s population. Ingram Micro works with 1,500 original equipment manufacturer (OEM) vendor partners and 170,000 technology solution provider customers across the cloud, AI, data, infrastructure, security, storage, and networking. Ingram Micro’s mission is to enable technology channel partners to accelerate growth, run better and more profitably, and deliver more value for their customers—business and technology decision-makers.

Ingram Micro turned to GigaOm to enable its reseller and OEM partners to drive strategic growth by connecting their technology solutions to the strategic operational business models, organizational maturity, and transformation goals of customer organizations.

Karl Connolly is technologist and field CTO at Ingram Micro. He says the relationship with GigaOm provides unique value to Ingram Micro’s channel-focused business.

“GigaOm understands customers’ motivations so they can help partners better connect their technology solutions to how customers run their business,” says Connolly. “Ingram Micro is a big brand that is recognized, but is one step removed from the end customer by nature of its channel-focused business model. Its capabilities are not necessarily understood or known to end customers and resellers—gaining the voice and trust of end customers always takes place via an Ingram Micro reseller.”

Figure 1. Karl Connolly, Technologist and Field CTO, Ingram Micro

The goal was to help end-user organizations make better-informed technology choices to enable their businesses, explains Connolly. “The list of choices of technology solutions available to a customer can be daunting, and with each vendor competing for said customer, informed decisions can be clouded by misrepresentation, ambiguity, and partner preference.”

Ingram Micro understood that getting a clearer picture of a customer’s business and operational model requires insight into the customer’s people and process organizational maturity and its transformational aspirations. A better understanding of how to engage with end-user organizations would positively impact Ingram Micro, vendor, and partner sales revenue.

Why GigaOm?

The company has worked with analyst firms before, including IDC, Gartner, and Forrester. However, as Ingram Micro developed its go-to-market approaches and channel sales strategies, it saw GigaOm’s business and technical practitioner-led approach, covering C-level leaders, architects, and engineers, as a major asset.

“GigaOm’s advisors are experienced strategists, practitioners, and engineers who have been IT buyers and consumers,” explains Connolly. “This unique perspective provides us with the ‘voice of the customer,’ allowing us to connect technology solutions to C-level, line of business, and architect business value-centric strategy based on customers’ operational business models, organizational maturity, and transformation goals, which has proven instrumental in shaping our go-to-market strategy.”

GigaOm’s unique position as a C-level, line of business, architect, and engineer practitioner-led advisory, research, and enablement company, along with its voice-of-the-customer understanding, enables it to present technology in a way end-user organizations can connect with.

“GigaOm helps us and our partners make the case to end users for adoption of a technology area based on brand or specific product by being the voice of the customer,” says Connolly. “The research and insights enable informed decisions based on experience, testing, and unbiased assessment from the end user perspective. We are afforded opinions, perspectives, and facts that aren’t attainable from other firms, or from end users directly.”

GigaOm sees research as a tool to enable stakeholders on all sides: Vendors and partners understand how to talk to customers better, and end-users become better equipped to decide between complex offerings. It was this flexible approach that brought Ingram Micro and GigaOm together.

Figure 2. GigaOm and Ingram Micro Engagement Model

Aspects of GigaOm’s offering align with Ingram Micro’s vision, strategy, and approach, not least GigaOm’s brand value and partnership approach. “GigaOm has built a good solid brand and has credibility, and the DNA of the company fits with ours on channel partners,” says Connolly.

“We particularly appreciate GigaOm’s strong connections with many of the OEMs supported by Ingram Micro. This synergy has further enhanced the value of their insights for our business,” says Connolly. “We and our partners can also generate revenue with GigaOm by reselling GigaOm research and services to end-user customers; that is a unique capability.”

Solution and Approach

The partnership between Ingram Micro and GigaOm has been directly targeted at relationship building and enabling the company to engage in more strategic conversations about technology solutions. To kick things off, GigaOm CTO Howard Holton presented to Ingram Micro solution architects, covering strategic areas such as the shift from CAPEX to OPEX.

These customer-led perspectives helped solutions teams better understand how to connect with OEM vendors based on a balanced perspective of their offerings. “Having an unbiased expert in Howard is invaluable,” says Connolly. “Often, teams are informed by the vendor, which can be limiting.”

As a result of the engagement, the Ingram Micro team gained a firmer foundation for discussing solutions with vendors and solution providers, enabling them to better drive customer conversations.

In addition, GigaOm participated in several conversations across multiple service providers and other partner firms, including T-Mobile, Betacom, and Otava. The goal was to develop effective business and technical enablement and sales strategies with customers based on their industry vertical, with positioning, messaging, benchmarking, market evaluation, and cost analysis services spanning C-level, line-of-business, architect, and engineer audiences. “GigaOm’s openness and willingness to build custom engagements for the partner was very well received, with the right set of assets delivered,” says Connolly.

Benefits

Overall, GigaOm’s engagement with Ingram Micro enabled the company to hone its strategies and sales plays based on the language of the customer. “GigaOm’s service of advisory and validation is a step above what its peers provide. The value of the advisory, coming from the vantage point of one who has done it, is more compelling than that of one who has studied or read up on a subject,” says Connolly.

Not only this, but the interaction helped Ingram Micro identify new opportunities within its partner portfolio. A specific example is Betacom: “GigaOm was the reason Ingram Micro became aware of Betacom, which is becoming a strategic partner for Ingram Micro in the 5G and industrial manufacturing space,” says Connolly. Manufacturing and operational technology (OT) is undergoing rapid digital transformation and is a relatively new industry vertical for Ingram Micro—an unexpected benefit came from GigaOm’s Holton, who had previously worked and consulted at major industrial companies.

“Our advanced solutions team leveraged Howard’s operational business insight and strategic expertise to gain understanding of the manufacturing and OT buyer across real world insights, considerations, and buying process advice. This helped the team get informed and prepared for partner meetings and our MxD partnership.”

GigaOm’s research and insights have directly impacted growth for Ingram Micro and its partners, says Connolly. “GigaOm services enable driving increased share of wallet by promoting more of what Ingram Micro offers to its partners in product, services, and solutions.”

Connolly says this direct benefit emanates from several factors:

Thought leadership: “An advisory company like GigaOm that has a consumer community consisting of tech and business influencers is a good way to be seen as thought leaders. GigaOm can potentially help buyers understand their options as it pertains to a new concept, program, or technology to support their operating model and transformation goals.”

Credibility & people cost savings: “GigaOm can help less mature or cost-conscious partners gain consulting capabilities without the need to staff CIOs, field CTOs, architects, and engineers in-house that could cost upward of $1 million dollars, instantly providing credibility across a broad domain of markets, services, and solutions.”

Independence: “GigaOm can act as a symbiotic extension to a partner, as it has no desire or aspiration to become a VAR and is purely there to help the partner uncover more opportunity.”

Opportunity: “GigaOm insights enable partners and Ingram Micro to stay current on macro themes and the OEMs filling those spaces, which has tangential and heretofore unmeasured value.”

Content: “Candid feedback on some of the materials and presentations we gave has been helpful in shaping how future content can be crafted and delivered to better connect with customer’s business operating model, organizational maturity and transformation goals.”

Next Moves

Ingram Micro and GigaOm will continue to build on the current success with individual partners and customers, and Ingram Micro will use the GigaOm partnership to further its position with service providers. “GigaOm will aid our partners to better connect with customers, gain visibility and mindshare in the market, specifically as we engage with our MNO and private network provider partners who can use enablement research and advisory services from GigaOm to become better known as leaders in a specific area, such as private 5G or connected workers,” says Connolly.

And what about Ingram Micro? “Beyond additional sales opportunities and new partnerships, our relationship with GigaOm can inform our portfolio and the solutions we offer our partners, elevating us beyond the traditional role that distribution plays.”

