
Artificial Intelligence – the next level

Artificial intelligence (AI) is no longer science fiction but has long been an integral part of our everyday lives. It is contained in countless applications and tools that make our lives easier, both privately and professionally. The majority of the assistive systems we use every day belong to so-called weak AI. Such systems usually work on a rule basis and are limited to a clearly defined area of application. They are not able to think outside the box. The latter is only possible with AI based on a modern understanding of machine learning.

So-called deep learning algorithms are based on artificial neural networks whose structure and function are modelled on the neurons of the human brain. Thanks to a large information base and the structure of the neural networks, deep learning systems can independently establish links to new content. They are also able to draw analogies without being confronted with certain keywords in the context of a question. You know comparable situations from your everyday communication with other people. For example, in the case of the statement “This is the sports car among telephones”, it should be clear to everyone that the word “sports car” is to be understood in this context as a synonym for “top of the range”. Accordingly, if you ask a deep learning-based chatbot or voicebot the question “What is the London of France?”, it will correctly answer “Paris”. In this example, the bot presents the appropriate solution without the keyword “capital” being mentioned.
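To illustrate, this analogy behaviour can be reproduced with pretrained word embeddings. The following minimal Python sketch assumes the gensim library and its downloadable GloVe vectors; it is an illustration of the principle, not the system described in this article:

```python
# Minimal sketch: analogy reasoning with pretrained word embeddings.
# Assumes gensim is installed and the GloVe vectors can be downloaded.
import gensim.downloader as api

# Load a small pretrained GloVe model (all tokens are lowercase).
model = api.load("glove-wiki-gigaword-50")

# "What is the London of France?"  ->  london - england + france ≈ paris
result = model.most_similar(positive=["london", "france"],
                            negative=["england"], topn=3)
print(result)  # 'paris' typically ranks among the top candidates
```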

The more meaningful data such systems process, the more reliable they become. Ideally, this “learning effect” leads to continuous self-optimization.

But how does deep learning work in practice?

Suppose an insurance company wants to automate its customer dialogue and set up a modern voice portal to adequately answer customer queries. In the ideal case, there is already a sufficiently large number of documented customer queries. This data basis is used to train the voicebot for its industry-specific field of application and to familiarize it with the logic and range of formulations used by insurance customers. Additionally, data from sector-specific or general knowledge databases, for example online encyclopaedias such as Wikipedia, can be added. In principle, the text collection can be adapted precisely to the requirements of a company.
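Purely for illustration, such a data basis might look like the following sketch; the queries, intent labels and background texts are hypothetical placeholders, not real customer data:

```python
# Illustrative sketch of a training data basis for an insurance voicebot.
# Queries and intent labels are hypothetical placeholders.
training_data = [
    ("my car is scrap",                 "motor_vehicle_insurance"),
    ("I had an accident with my car",   "motor_vehicle_insurance"),
    ("it's about my tooth crown",       "dental_insurance"),
    ("I need to report water damage",   "household_insurance"),
    ("when will my claim be paid out?", "claim_status"),
]

# Optional background texts from sector-specific or general knowledge
# sources (e.g. Wikipedia articles) to broaden the vocabulary.
background_texts = [
    "Vehicle insurance covers damage resulting from traffic collisions.",
    "A dental crown is a type of dental restoration placed over a tooth.",
]
```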

In order to enable the AI to link and analyze the existing data and to draw precise conclusions, the immensely large database must first be made manageable. Words that are not meaningful in the application context are deleted, while relevant terms are reduced to their word stem. In this way, instead of various word forms (e.g. “asked”, “ask”, “asks”, etc.), only the basic form of each word (in this example: “ask”) is retained, so that terms in the existing text collection can be weighted more precisely. The text or input information is then converted into vectors. Distances can thus be calculated in vector space and words can be weighted according to their context-dependent significance.
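A minimal sketch of this preparation step, using the NLTK Porter stemmer and scikit-learn's TF-IDF vectorizer as stand-ins for whatever tooling is actually used; the sample sentences are invented:

```python
# Minimal sketch of the preprocessing described above: stop-word removal,
# stemming, and conversion of texts into weighted vectors (TF-IDF).
from nltk.stem import PorterStemmer
from sklearn.feature_extraction.text import TfidfVectorizer, ENGLISH_STOP_WORDS
from sklearn.metrics.pairwise import cosine_similarity

stemmer = PorterStemmer()

def preprocess(text: str) -> str:
    # Drop words without meaning in this context, then reduce the rest
    # to their stem ("asked", "asks" -> "ask").
    words = [w for w in text.lower().split() if w not in ENGLISH_STOP_WORDS]
    return " ".join(stemmer.stem(w) for w in words)

docs = [
    "The customer asked about motor vehicle insurance",
    "She asks how to report car damage",
    "Questions about dental treatment and tooth crowns",
]

# TF-IDF weights terms by their context-dependent significance.
vectorizer = TfidfVectorizer(preprocessor=preprocess)
vectors = vectorizer.fit_transform(docs)

# Distances (here via cosine similarity) can now be computed in vector space.
print(cosine_similarity(vectors[0], vectors[1]))  # the two car documents are closer
print(cosine_similarity(vectors[0], vectors[2]))  # the dental document is further away
```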

As soon as the available database, meaning the entire text collection, has been prepared, the AI has an easy job. In our example, the voice portal would now have the task of searching incoming requests for patterns based on the training data. In the casually formulated statement “my car is scrap”, for example, the deep learning system recognizes the relevant terms “car” and “scrap” and logically assigns the query to the department responsible for motor vehicle insurance. In addition, the AI can also take over the complete identification of the concern by asking the customer specific questions to narrow down the damage.
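A minimal sketch of this kind of pattern matching, here approximated with a simple scikit-learn text classifier trained on invented example queries; the intent labels and the choice of classifier are assumptions, not the actual system:

```python
# Minimal sketch: routing a casually formulated request to a department
# by classifying it against illustrative, labelled training queries.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training queries and intent labels.
queries = [
    "my car is scrap", "I crashed my car", "someone hit my vehicle",
    "it's about my tooth crown", "question about dental treatment",
    "when will my claim be paid out", "what is the status of my claim",
]
intents = [
    "motor_vehicle_insurance", "motor_vehicle_insurance", "motor_vehicle_insurance",
    "dental_insurance", "dental_insurance",
    "claim_status", "claim_status",
]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(queries, intents)

# The incoming statement is assigned to the responsible department.
print(classifier.predict(["my car is scrap after the accident"]))
# -> ['motor_vehicle_insurance'] (with this toy data)
```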

With the aid of these deep learning systems, it is even possible to process terms that do not appear in the underlying data set. For example, if a customer says, “It’s about my tooth crown”, the request can be processed even if the word “tooth crown” is not part of the text collection. Conventional bots are not capable of such feats. Modern AI systems, however, are able to assign unknown terms by means of the distance measures mentioned above, as long as they have a similarity to words contained in the data set. In our example, for the recognition of the term “crown” it would be sufficient if the text collection contained a word like “dental treatment”.
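A minimal sketch of this idea, again using pretrained GloVe vectors from gensim as a stand-in; the domain word lists represent terms from the prepared text collection and are purely illustrative:

```python
# Minimal sketch: assigning a phrase whose words are missing from the
# training texts via distances in embedding space.
import gensim.downloader as api
import numpy as np

model = api.load("glove-wiki-gigaword-50")

# Illustrative words from the (hypothetical) text collection, grouped by department.
domains = {
    "dental_insurance":        ["dental", "treatment", "dentist"],
    "motor_vehicle_insurance": ["car", "vehicle", "accident"],
}

def closest_domain(phrase: str) -> str:
    # Average embedding similarity between the phrase words and each domain.
    words = [w for w in phrase.lower().split() if w in model]
    scores = {
        name: np.mean([model.similarity(w, d) for w in words for d in vocab])
        for name, vocab in domains.items()
    }
    return max(scores, key=scores.get)

print(closest_domain("tooth crown"))  # expected: 'dental_insurance'
```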

One bot, numerous advantages

Unlike with conventional chatbots or voicebots, it is not necessary to think through all potential dialogue paths in advance or to anticipate the customers’ entire range of formulations. A system based on modern AI methods only requires a sufficiently representative, appropriately prepared database to work completely independently, for example by making sense of previously unknown terms without outside help.

The advantages are obvious: on the one hand, the quality and flexibility of the dialogues are improved; on the other hand, the scalability of the solution is considerably simplified. If the insurance company from our example wants to extend its voice portal to other areas of the company or cover a larger number of use cases, no new dialogue paths have to be worked out; the preparation of a suitable database is sufficient. The nice side effect? This saves not only time but also a lot of money. Furthermore, the maintenance effort for modern AI systems is comparatively low. To adjust the chatbot or voicebot to changed conditions or new requirements, only the data set has to be exchanged or adapted. The system does everything else itself, and even more: the longer it is used in a certain area of application, the more reliably it works.

IP Dynamics has the necessary know-how and the database to make deep learning algorithms even more efficient through industry-specific adaptations, for example of word dictionaries. In addition, the portfolio includes ready-made modules, for example for concern recognition (especially in the insurance sector), as well as solutions for identification and authentication (e.g. for information on the processing status of transactions or the recording of missing data) or callback management. A further, decisive advantage: in the Contact Center of IP Dynamics, all relevant customer service channels converge – telephone, e-mail, chat, messenger, social media. The combination of a modern AI system with our omnichannel platform offers considerable potential: in the complete solution, all incoming data are effortlessly captured, linked and evaluated. And the AI gets smarter and smarter and smarter and…