-rick- Posted 6 hours ago 11 minutes ago, SteamyTea said: I am quite interested in this. I can understand needing a fair bit of RAM to run the LLM, but is there something special about the OS that means it needs a massive amount? I know OSs are RAM hungry these days, unlike 35 years ago, when an image file took up more RAM than the OS did. Can the OS be stripped back to the very basics of what it was originally designed to do? MacOS will use less than 4GB IIRC. But decent LLMs are very memory hungry, so I think @Pocster was more focussed on the RAM needed by the LLM, not the OS. The smallest 'useful' models might be 16GB, and then you need a ton more for context windows + KV cache. These models will still be far behind anything from the main commercial providers. More memory = better model + more context.
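The arithmetic behind "a ton more for context windows + KV cache" is easy to sketch. A rough back-of-envelope estimate in Python, assuming a Llama-3-8B-style configuration (32 layers, 8 KV heads, head dimension 128, fp16 cache) - these figures are illustrative, not from this thread, so check the model card for your actual model:

```python
def llm_ram_bytes(n_params, bits_per_weight,
                  n_layers, n_kv_heads, head_dim,
                  ctx_len, kv_bytes_per_elem=2):
    """Rough RAM estimate: quantised weights plus fp16 KV cache."""
    weights = n_params * bits_per_weight / 8
    # One K and one V vector per layer, per KV head, per token in context
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * kv_bytes_per_elem * ctx_len
    return weights, kv_cache

# 8B parameters at 4-bit quantisation, 8192-token context window
w, kv = llm_ram_bytes(8e9, 4, 32, 8, 128, 8192)
print(f"weights ~{w / 2**30:.1f} GiB, KV cache ~{kv / 2**30:.1f} GiB")
# → weights ~3.7 GiB, KV cache ~1.0 GiB
```

The KV cache grows linearly with context length, which is why a long chat session eats RAM well beyond the model file itself.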
SteamyTea Posted 5 hours ago 16 minutes ago, -rick- said: MacOS will use less than 4GB IIRC Less than I thought, and similar to Windows, which is a bit messy with RAM. Many years ago I used Puppy Linux, which ran totally in RAM, including all the applications. It was pretty good on my old laptop, which had only 4 MB. The screen died eventually. I often wonder how it would perform on a more modern machine. I think DSL works the same way, and that was only about 50 MB for everything.
MikeSharp01 Posted 4 hours ago 9 hours ago, Pocster said: I'll say it again, it's insane! Yes, but as yet it is no different from most technological innovations in the history of technology, in that it enables people to do more: flint, the railways, the jet engine, the mobile phone, the LLM, the next thing. What will change that is AGI, and although that could happen tomorrow, I am not sure the underlying technology we have (digital) will get us there.
Pocster (Author) Posted 4 hours ago 1 hour ago, SteamyTea said: I am quite interested in this. I can understand needing a fair bit of RAM to run the LLM, but is there something special about the OS that means it needs a massive amount? I know OSs are RAM hungry these days, unlike 35 years ago, when an image file took up more RAM than the OS did. Can the OS be stripped back to the very basics of what it was originally designed to do? Tricky on Mac and Windows. You have to remember Mac is leaner than Windows, and Linux leaner still - but more work. I wouldn't fight this aspect, tbh.
Pocster (Author) Posted 4 hours ago (edited) 15 minutes ago, MikeSharp01 said: Yes, but as yet it is no different from most technological innovations in the history of technology, in that it enables people to do more: flint, the railways, the jet engine, the mobile phone, the LLM, the next thing. What will change that is AGI, and although that could happen tomorrow, I am not sure the underlying technology we have (digital) will get us there. Maybe - but I think that understates what's different this time. Most past technologies made human labour more productive. LLMs do that too, but they also start to substitute for parts of knowledge work: drafting, coding, support, admin, research, analysis, design, marketing, legal prep, teaching material, etc. That doesn't require full AGI. You don't need a conscious machine to reduce headcount; you only need a tool that lets one person do the work that previously needed three, or lets a cheaper worker do work that previously needed a specialist. So even if LLMs are "only" productivity tools, the labour-market effect can still be huge. The Industrial Revolution didn't replace every worker overnight either - it reorganised whole industries, compressed wages in some areas, created new winners, and made old skills less valuable. My concern isn't that every job vanishes tomorrow. It's that large areas of white-collar work become more automated, more competitive, and need fewer entry-level people. That alone is enough to be disruptive without invoking AGI. My project would once have needed a small team. Now it's one person who doesn't need to manually code. Just this on its own changes everything - a hobby project that proves the workflow… Edited 4 hours ago by Pocster
Pocster (Author) Posted 4 hours ago (edited) 1 hour ago, SteamyTea said: Less than I thought, and similar to Windows, which is a bit messy with RAM. Many years ago I used Puppy Linux, which ran totally in RAM, including all the applications. It was pretty good on my old laptop, which had only 4 MB. The screen died eventually. I often wonder how it would perform on a more modern machine. I think DSL works the same way, and that was only about 50 MB for everything. Quite possibly. You also have to take into account your context window (the chat window), so it depends exactly what you want the LLM for. You can go really small, i.e. a low RAM footprint, but you are compromising reasoning and accuracy. So it really does depend on intended use. Will it run in 8GB? Yes. Is it useful? Depends on use. Edited 4 hours ago by Pocster
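To put rough numbers on the "go really small" trade-off: weight-only footprints for a typical 7B-parameter local model at common quantisation levels (a sketch that ignores runtime overhead, activations and the KV cache - real usage will be higher):

```python
N_PARAMS = 7e9  # a typical 'small' local model

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    gib = N_PARAMS * bits / 8 / 2**30
    print(f"{name:5s} ~{gib:.1f} GiB of weights")
# fp16 ≈13.0 GiB, int8 ≈6.5 GiB, int4 ≈3.3 GiB
```

Fewer bits per weight means a smaller, faster model, but with some loss of accuracy; 4-bit quants are a common compromise for local use, which is how a 7B model squeezes into an 8GB machine at all.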
MikeSharp01 Posted 4 hours ago 2 minutes ago, Pocster said: You don't need a conscious machine to reduce headcount; you only need a tool that lets one person do the work that previously needed three. No, but if you look at it broadly this is the same as the switch from canals to railways, and from railways to roads. 3 minutes ago, Pocster said: knowledge work: drafting, coding, support, admin, research, analysis, design, marketing, legal prep, teaching material, etc Many of these are relatively new kinds of work, and again, many jobs today were not around 5 years ago. I am in the optimist camp - I am optimistic that this technology will be good for us. I sometimes drift towards the agnostic, but I have never felt like being pessimistic about it. For me this is just another technology - learn to use it and make your fortune; ignore it at your peril, and it won't go away. Essentially it is capable of making the cake bigger for all of us.
Pocster (Author) Posted 4 hours ago 2 minutes ago, MikeSharp01 said: No, but if you look at it broadly this is the same as the switch from canals to railways, and from railways to roads. Many of these are relatively new kinds of work, and again, many jobs today were not around 5 years ago. I am in the optimist camp - I am optimistic that this technology will be good for us. I sometimes drift towards the agnostic, but I have never felt like being pessimistic about it. For me this is just another technology - learn to use it and make your fortune; ignore it at your peril, and it won't go away. Essentially it is capable of making the cake bigger for all of us. I agree it can make the cake bigger. I'm not saying AI is only bad or that we should ignore it. But "the cake gets bigger" doesn't mean the slices are evenly distributed. Yes, canals to railways to roads changed employment. But those transitions still destroyed some jobs, shifted power, and forced people to retrain. The fact society eventually adapted doesn't mean the disruption wasn't real for the people caught in it. The difference here is speed and breadth. LLMs are not replacing one transport system with another. They touch almost every desk-based industry at once: coding, admin, sales, marketing, support, accounts, legal prep, design, analysis, documentation. My own project is a good example. What would once have needed a small team is now potentially one person directing an LLM, with the AI doing much of the manual coding and iteration. That is brilliant for me as the person using it, but it also means fewer people are needed to produce the same output. So yes, learn to use it. I completely agree. But that doesn't remove the labour-market issue. In fact it proves it: those who use it well become far more productive, and those who don't are under pressure. That is not just "another technology" in a mild sense. That changes the structure of work.
Pocster (Author) Posted 4 hours ago Another point is the speed of improvement. I accept local LLMs are not frontier models. They are behind the best cloud systems. But the pace is ridiculous. Every few weeks there seems to be a better open/local model, better quantisation, better tooling, better context handling, better coding ability, or better inference speed. That matters because the argument is not "can today's model replace everyone?" The argument is "where is this going over the next 3, 5, or 10 years?" I've never personally seen a technology move this fast. With most technologies you get gradual product cycles. With LLMs, the capability jump over months is noticeable. A model that felt barely useful a year or two ago can now write, debug, explain, summarise, plan and generate code well enough to materially change how one person works. So yes, today's local models are not AGI and not frontier. But the gap is closing fast enough that dismissing this as just another normal productivity tool feels complacent.
MikeSharp01 Posted 4 hours ago 1 minute ago, Pocster said: But that doesn't remove the labour-market issue. You are absolutely correct there, and I can see it in front of me with my students - getting jobs in coding is now very difficult, but getting jobs in integrating LLM coding with business objectives is booming. 3 minutes ago, Pocster said: But "the cake gets bigger" doesn't mean the slices are evenly distributed. That is also true - just 7 people, all of whom have a finite amount of life left, are making all the headway and a lot of the profit. BUT underlying that profit is enough people able to pay for their products; when that dries up, it all goes phut! 3 minutes ago, Pocster said: I've never personally seen a technology move this fast. Nope, but then maybe the next technology will follow the trend and be even faster. Each technology of the recent past has had a faster adoption curve than the last one; it's just the way it is, and, so far, we have not seen a breaking of the pattern in my view.
Pocster (Author) Posted 4 hours ago Yes, and China has already started grappling with this. Chinese courts have reportedly ruled that companies cannot simply fire someone purely because AI can do the job cheaper. That proves this is not just imaginary pessimism - governments and courts are already seeing AI replacement as a labour-market issue. And that is exactly my point. If AI were merely "another productivity tool", why would courts need to decide whether workers can be dismissed because an AI system now performs the role? The fact they are having to rule on it shows the disruption is real. It may make the cake bigger overall, but it can still destroy specific jobs, squeeze wages, and collapse small teams into one person plus AI. That is a major structural change, not just canals becoming railways.
Pocster (Author) Posted 4 hours ago 1 minute ago, MikeSharp01 said: You are absolutely correct there, and I can see it in front of me with my students - getting jobs in coding is now very difficult, but getting jobs in integrating LLM coding with business objectives is booming. That is also true - just 7 people, all of whom have a finite amount of life left, are making all the headway and a lot of the profit. BUT underlying that profit is enough people able to pay for their products; when that dries up, it all goes phut! Nope, but then maybe the next technology will follow the trend and be even faster. Each technology of the recent past has had a faster adoption curve than the last one; it's just the way it is, and, so far, we have not seen a breaking of the pattern in my view. A coder who uses an LLM, absolutely! It's not about writing code any more - as I said, it's orchestration of the LLM (using the tool). The issue I see is fewer juniors/graduates, and when 'seasoned' programmers/SEs retire, who carries the knowledge/understanding forward?
MikeSharp01 Posted 4 hours ago 2 minutes ago, Pocster said: why would courts need to decide whether workers can be dismissed because an AI system now performs the role? I suspect because they can make laws like this in China - we cannot do that here. We have always adapted to new technologies and we will again; it will be tough for many, but an opportunity for many also. 1 minute ago, Pocster said: when 'seasoned' programmers/SEs retire, who carries the knowledge/understanding forward? This is the nub of the issue, and here you and I can share much more common ground - this is down to short-sighted entrepreneurs not realising that if you want longevity you need a pipeline. This is the problem we had with apprenticeships in the 1980s, 90s and noughties. Here we can demand intervention to stop people saving money and pocketing profit in the short term while destroying productivity in the long term - this we can legislate for. Those who neglect history are destined to repeat it.
-rick- Posted 4 hours ago 2 minutes ago, MikeSharp01 said: I suspect because they can make laws like this in China - we cannot do that here. We have always adapted to new technologies and we will again; it will be tough for many, but an opportunity for many also. We do have rules about firing workers to replace them with cheaper ones - things like TUPE. But our current breed of politicians tend to focus on the wrong things, and AI regulation has been low on their list, especially since Trump got elected, as they seem particularly wary of doing anything that might upset him, and the tech industry makes sure he knows about anything they don't like coming from us.
SimonD Posted 2 hours ago 1 hour ago, Pocster said: A coder who uses an LLM, absolutely! It's not about writing code any more - as I said, it's orchestration of the LLM (using the tool). The issue I see is fewer juniors/graduates, and when 'seasoned' programmers/SEs retire, who carries the knowledge/understanding forward? 1 hour ago, MikeSharp01 said: This is the nub of the issue, and here you and I can share much more common ground - this is down to short-sighted entrepreneurs not realising that if you want longevity you need a pipeline. This is the problem we had with apprenticeships in the 1980s, 90s and noughties. Here we can demand intervention to stop people saving money and pocketing profit in the short term while destroying productivity in the long term - this we can legislate for. Those who neglect history are destined to repeat it. For me there are two major strands here. First, to develop tools properly using AI you need domain-specific knowledge - not only for the functional and technical specification (because in my experience AI misses this and can very easily run away with itself in some rather bizarre ways, totally forgetting the original specification, even allowing for the context window and memory degradation), but also in the domain of development itself: you have to be able to properly and fully sense-check the outputs and assumptions made by the model. If you don't have the experience, you're going to miss not just major stuff but the important nuances required in good development. There are definitely problems with how knowledge gaps will develop from short-term profiteering. Second, I see it as something similar to the 1990s off-shoring of customer service call centres to cheaper locations, at eventual great cost, with brands then required to re-onshore those services to keep customers happy, or at least provide decent escalation routes.
Although it still does happen, it was largely a failed endeavour. A lot of what is happening in AI is the same, and I think it will bite back - I've already developed an allergy to those cheap, horrible customer service bots/agents that never actually answer my question. But on a much larger scale, I think the realisation might be something along the lines of the long-term costs of off-shoring all our industrial and manufacturing facilities, knowledge and capabilities. In the UK we've done this in favour of financialisation and services, and it's coming back to bite us now. I see this as the fault of much-hailed people like Dyson, who had a very patronising and blinkered view of offshoring back in the noughties. I remember listening to one of his speeches where he was ever so confident that off-shoring manufacturing to China was nice and clean, in that it wouldn't involve any transfer of IP or high-value knowledge, as it was only the low-value stuff they'd get - oh how wrong he was. So the more we indiscriminately off-shore to AI, the more we're going to create a rod for our own backs. None of this is to say that AI is universally bad. I use it all the time and it helps me a great deal to get things done in a myriad of ways. You just need to know where your off-ramp is for when it merely gives you the impression of benefit.