Since I slowly became the chatops specialist where I work, I've started to think I want more. Since the beginning we have been interacting with programs essentially through graphical interfaces. First asynchronous, if we think about the web, now more and more synchronous, but these interfaces are not human. They are designed to give control to their operators.
But the more autonomous our programs become, the more we should trust them to sort information by priority. Interfaces like Siri and Echo are much more ‘human’ and conversational. It doesn’t take a genius to speculate that graphical interfaces will die one day, except for very specialized uses, and that most interaction will simply become conversational.
In the course of developing interactive agents for technical needs, I have noticed that adding just a little bit of intelligence and memory to those agents goes a long way for usability. Especially in chatops, a lot of the actions required from those agents are predictable and repetitive. The development of new features should follow the recognition of those patterns and shorten the path to accomplishing those actions. That’s pretty much my job.
But coding this continually is not very cost effective. Tools change, and then patterns evolve. Now all I can think about is a way to design an IRC bot that learns by itself: a program that does real meta-programming and treats its commands as data rather than as hard-coded, pre-conceived paths for information to flow through.
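To make the idea concrete, here is a rough Python sketch of what "commands as data" could look like. The `CommandBot` class, the `!learn`/`!run` syntax, and the shell-template approach are all hypothetical, just one possible shape for such a bot, not something I've built:

```python
# A minimal sketch of the "commands as data" idea: instead of hard-coding
# handlers, the bot keeps a command table it can extend at runtime from chat.
# The CommandBot name and the !learn / !run syntax are made up for illustration.

import shlex
import subprocess


class CommandBot:
    def __init__(self):
        # Commands live in a plain dict: name -> shell template.
        # Because they are data, the bot can add, replace, or persist them.
        self.commands = {}

    def handle(self, message):
        """Dispatch one chat line and return the bot's reply (or None)."""
        if message.startswith("!learn "):
            # "!learn deploy = ./deploy.sh {0}" stores a new command template.
            name, _, template = message[len("!learn "):].partition("=")
            self.commands[name.strip()] = template.strip()
            return f"learned '{name.strip()}'"

        if message.startswith("!run "):
            parts = shlex.split(message[len("!run "):])
            name, args = parts[0], parts[1:]
            template = self.commands.get(name)
            if template is None:
                return f"unknown command '{name}'"
            # Fill positional placeholders, then execute the stored command.
            cmd = template.format(*args)
            result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
            return result.stdout.strip() or result.stderr.strip()

        return None  # not addressed to the bot


if __name__ == "__main__":
    bot = CommandBot()
    print(bot.handle("!learn greet = echo hello {0}"))
    print(bot.handle("!run greet chatops"))
```

The point is not the shell templates themselves, it's that the bot's behaviour lives in a structure it can inspect and modify, which is the first step before any kind of pattern recognition or self-learning on top of it.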
If you know of tools that already do that, can you fire me an email?