Imagine a conversation between you and your computer. The words you speak become instructions for the program, and actions unfold on the screen based on what you ask. Our way of speaking, though socially centered, can be translated into the logically centered language of machines. Many of our social norms exist, in part or in whole, because of our ability to communicate through language. Yet language continuously shifts and changes, evolving naturally alongside technology and computer programming. Programming languages came about to give humans a way to translate their ideas into commands for machines to follow. Many people in the tech industry can translate from human language to machine language, making them bilingual in human and computer. The machines, however, are fluent only in computer, and cannot truly understand human.

In tech and computing, language appears in two branches: as written, spoken, or visual representations of language within a dataset, or as computer code and programming languages. As humans, we want to translate our human languages into a machine-based language: the code and scripts that an AI or program follows. While human language can be translated into computer language nearly seamlessly, an important difference remains between them. Human languages exist in three fields – speech, writing, and gesture – while machine-based languages exist in only one: writing. Depending on the context, a computer's written language can drive oral or visual cues, such as in AI chatbots with procedural animation and speech synthesis, yet ultimately the language remains written. Meaning also differs between the languages. Machine languages are used almost exclusively for requests, orders, and logic, whereas human languages are applied in a wide variety of contexts, such as this blog post.

Computers and AI can analyze human language in detail that often passes us by. For example, a simple linguistic program can count which words are used most in a transcript, or detect filler words in an audio clip. Computer syntax, in conjunction with human language, can reveal trends in a speaker's sentiment and personality, making it incredibly useful in research where demographics and statistics are key. It also has uses in more creative fields such as scriptwriting, where some linguistic neural networks can write passages of text, such as a semi-cohesive story. Ultimately, these projects come about through a conversation between human and computer language.
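As a toy sketch of that first example – counting the most common words in a transcript – a few lines of Python are enough. The transcript string here is invented for illustration:

```python
from collections import Counter
import re

def top_words(transcript: str, n: int = 3) -> list[tuple[str, int]]:
    """Return the n most frequent words in a transcript."""
    # Lowercase and pull out word-like tokens, ignoring punctuation.
    words = re.findall(r"[a-z']+", transcript.lower())
    return Counter(words).most_common(n)

# A made-up snippet of speech for demonstration.
transcript = "well I think, I think the plan is good, well worth it"
print(top_words(transcript))  # the three most repeated words, with counts
```

The same `Counter` approach extends naturally to the filler-word case: instead of taking the top of the list, you look up how often tokens like "um" or "like" appear.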

For a computer to truly understand and analyze human language, the machine needs to be taught – in its own way – how to become fluent in it. This is achieved, in part, by finding methods that help a machine understand what is being said, with the right amount of linguistic training and human input. With a solid dataset and good training, a computer can aid in programming projects involving linguistic analysis, with applications in areas like marketing and recommender systems. The benefits of computers analyzing human linguistics are well worth the effort.
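A minimal sketch of what that kind of training can look like, assuming a tiny hand-labeled dataset (the sentences and labels below are invented for illustration). This is a toy word-overlap classifier, not a production method, but it shows the basic loop of learning from labeled human language:

```python
from collections import Counter
import re

def tokenize(text: str) -> list[str]:
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

# Hypothetical hand-labeled training examples (an assumption for illustration).
training = [
    ("I love this product, it works great", "positive"),
    ("great quality and fast shipping", "positive"),
    ("terrible experience, it broke immediately", "negative"),
    ("awful support and slow delivery", "negative"),
]

# "Training": tally how often each word appears under each label.
counts = {"positive": Counter(), "negative": Counter()}
for text, label in training:
    counts[label].update(tokenize(text))

def classify(text: str) -> str:
    """Pick the label whose training vocabulary overlaps the text most."""
    words = tokenize(text)
    scores = {label: sum(c[w] for w in words) for label, c in counts.items()}
    return max(scores, key=scores.get)

print(classify("the shipping was great"))   # → positive
print(classify("it broke, awful quality"))  # → negative
```

A real system would use far more data and a proper statistical model, but the shape is the same: labeled human language in, a machine-usable judgment out – exactly the kind of analysis that marketing and recommender work relies on.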


Daniel Nordfors