
Will we have a DT 4 in the near future?

That’s definitely part of it. Right now Devon rightfully looks at what OpenAI et al. produce and says it’s unreliable, potentially dangerous, slow, and expensive compared to what’s built into DT. Their customers appreciate this stance very much. However, Devon also sees what people value about the input/output of an LLM (using LLM as shorthand for all the newly popular tools of this decade).

So, Devon is looking into expanding capabilities to match changing expectations safely and performantly, while all these other companies and communities are corralling LLMs, vector searches, and expert systems to be performant and provide safe output. It’s a race with some pretty interesting technical and community aspects.

The Christensen model says Devon can look for attributes of LLMs that would obviate features of DT that Devon is unwilling to give up. That unwillingness can have multiple causes; for example, a redesign may be too expensive, losing sales or customers in the short term may be too costly, or giving up the feature would violate a company value.

It also says Devon can look for aspects of DT that will hinder adoption of the low-cost entrants’ appealing features, even if Devon is watching closely and plans to adopt them enthusiastically.

Finally, Devon can look at its own financial and operational structure to see how it would handle changes to the business.

Dealing with those questions earlier means better software and healthier finances in the long term.

Thanks for keeping the discussion going after all!

