This is actually a huge deal for both AI AND the environment.
![Researchers upend AI status quo by eliminating matrix multiplication in LLMs](http://cdn.arstechnica.net/wp-content/uploads/2024/06/AI_lightbulb-760x380.jpg)
Researchers upend AI status quo by eliminating matrix multiplication in LLMs
arstechnica.com
Non Sequiturs is the personal blog of Michael Argentini.
I'm a software developer and Managing Partner for Fynydd and Blue Sequoyah Technologies, the project lead for Coursabi, and Āthepedia founder. I also have several nerdy open source projects on Github.
I'd describe myself as an Oxford comma advocate, autodidact, aspiring polymath, and boffin, with a mechanical keyboard addiction. You can also find me on Mastodon.
By default, Ollama binds to the loopback address 127.0.0.1. If you want it reachable from elsewhere on your network, macOS requires a system-wide environment variable change. The Ollama developers recommend running this command, and then relaunching Ollama:
launchctl setenv OLLAMA_HOST "0.0.0.0"
This works great! If you want to run this automatically at startup, you can use a small AppleScript to create an application, and then add that application to your Login Items.
First, open Script Editor and paste the following code:
-- Set the system-wide host binding, then launch Ollama
do shell script "launchctl setenv OLLAMA_HOST \"0.0.0.0\""
tell application "Ollama" to run
This binds Ollama to all network interfaces and then launches it. You can, of course, bind a specific IP address instead.
Next, in the File menu use the Export command to create an Application that is signed to run locally. Quit Script Editor and then put the new application in your Applications folder.
Finally, search for “Login Items” in System Settings. Remove any existing Ollama application from the list, and add your application.
Now when you restart and sign in, your application will launch, set the environment variable for the host binding, launch Ollama, and quit.
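Once everything is in place, you can confirm the binding from another machine on your network. This is a minimal sketch; the LAN address below is a placeholder you'd replace with your Mac's actual IP, and it assumes Ollama's default port of 11434.

```shell
# Hypothetical LAN address of the Mac running Ollama; substitute your own.
OLLAMA_MACHINE="192.168.1.20"

# Ollama serves its API on port 11434 by default.
curl "http://${OLLAMA_MACHINE}:11434/"
```

A running server answers on the root endpoint; a refused connection usually means the environment variable wasn't set before Ollama launched.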
Given all the chaos surrounding the Sam Altman firing from OpenAI, it looks like AI really is threatening people’s jobs. Well, specific people anyway.
StackOverflow has introduced Labs, an enhanced developer experience that uses AI to streamline the process of insulting and demeaning people looking for help.
stackoverflow.co
Generative #AI platforms like #OpenAI #ChatGPT and DALL-E have both excited and terrified information workers. But #Amazon #Bedrock might be the one service that truly revolutionizes Internet platforms by offering a broader AI solution that does it all, integrated with #AWS.
aws.amazon.com
Amazon CodeWhisperer looks like a solid alternative to #Github #Copilot in the #AI coding companion space. Surprising that #Amazon beat #Google to this market. Even #Apple should be here at this point given that they own #Xcode.
aws.amazon.com
How will narrow AI technologies like #ChatGPT, #StableDiffusion, and others affect jobs in the next decade and beyond? Will they replace human workers? No. It's more likely that humans using narrow #AI tools will replace other humans who do not.
Every tech company is trying to come up with an AI strategy for 2023. We already know the answer: tack “AI powered” next to every existing product feature or service offering name.