Post by account_disabled on Jan 28, 2024 5:37:25 GMT
The arrival of generative AI has revolutionized the technology industry. The appearance of ChatGPT has opened new doors, made mechanical tasks easier, and even offered us inspiration to go further. At the beginning of June 2023 we finally got the long-awaited native ChatGPT app for the iPhone. The science behind these tools rests on supercomputing: servers with colossal computing capacity process our requests and, through algorithms trained on prior data, respond in natural language. However, one limitation of these systems is the immense computing power required to produce those answers.
That power is needed not only to generate the response itself, but also to interpret what we say: our sentences are broken down into small machine-interpretable units, called “tokens,” and that data is sent to the server for processing. We are at a very early stage of all this. We have only been exploring the possibilities of these new language models for two years; they are wonderful for helping us in our daily lives, but they have one small disadvantage: we need an internet connection to send our question's tokens and receive the answers. Personally, though, I believe the future of this technology is not in the cloud, but much closer to us. Today we are going to run a small experiment with an application that recently appeared in the App Store.
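To make that round trip concrete, here is a minimal Swift sketch of the flow. It is only an illustration: the whitespace split and the example.com endpoint are assumptions of mine, and real services use subword tokenizers (such as BPE) that produce numeric token IDs and expose their own authenticated APIs.

[code]
import Foundation

// Naive illustration only: real models use subword tokenizers (e.g. BPE)
// and send token IDs, not whitespace-separated words.
func roughTokens(for prompt: String) -> [String] {
    prompt
        .lowercased()
        .components(separatedBy: .whitespacesAndNewlines)
        .filter { !$0.isEmpty }
}

struct CompletionRequest: Codable {
    let tokens: [String]
}

// Hypothetical endpoint, standing in for whatever API a cloud chatbot exposes.
func sendToCloud(prompt: String) async throws -> Data {
    let body = CompletionRequest(tokens: roughTokens(for: prompt))
    var request = URLRequest(url: URL(string: "https://example.com/v1/complete")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(body)
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
[/code]

The point of the sketch is simply that every question leaves the device: without the network hop at the end, the cloud model never sees your tokens.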
Its name may not be very attractive, but Offline Chat can be useful and gives us a clear idea of the future that awaits us. The app runs a model based on Mistral 7B v0.2, one of the best language models with over 7 billion parameters. Thanks to a distillation technique applied to larger models, it offers surprising power, even surpassing bigger models such as Llama 2 13B, which has almost twice as many parameters. On-device execution is excellent: the model is heavily compressed yet remains very precise. Processing that much information locally, with no internet connection, is possible thanks to the processors in the latest iPhones, which include dedicated machine learning accelerators (Apple's Neural Engine).
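A big part of that "heavily compressed" claim usually comes from quantization: storing the model's weights at much lower precision so a 7-billion-parameter network fits in a phone's memory. The app does not document its exact scheme, so the following Swift sketch of symmetric 8-bit quantization is only an assumption-laden illustration of the general idea, not how Offline Chat actually packs its weights.

[code]
import Foundation

// Symmetric 8-bit quantization: map Float32 weights to Int8 plus one scale
// factor, cutting storage roughly 4x (4-bit schemes shrink it even further).
func quantize(_ weights: [Float]) -> (values: [Int8], scale: Float) {
    let maxAbs = weights.map { abs($0) }.max() ?? 1
    let scale = max(maxAbs, 1e-8) / 127
    let values = weights.map { Int8(clamping: Int(($0 / scale).rounded())) }
    return (values, scale)
}

// Recover approximate Float32 weights at inference time.
func dequantize(_ values: [Int8], scale: Float) -> [Float] {
    values.map { Float($0) * scale }
}

let original: [Float] = [0.12, -0.98, 0.44, 0.01]
let (q, s) = quantize(original)
print(q, s, dequantize(q, scale: s))  // small rounding error, a quarter of the storage
[/code]

The trade-off is a small loss of precision in exchange for a model that fits in RAM and can be fed to the phone's ML accelerators, which is exactly the bargain these on-device apps are making.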