- cross-posted to:
- linux@lemmy.ml
Coretemp and Ethernet. Also, a few years ago they let go of the guy that maintained MeshCentral (the only reason to pay extra $$$ for having Intel vPro compatible computers in the workspace)
Basically this tells their biggest customers “the next server needs to be based on AMD EPYC”
How much money could they possibly “save” with those THREE salaries? Just cut one week of travel by private jet for the C class and the same savings are served
But how else is the CEO going to cheat on his wife? Cold play concerts are def out of the picture now.
So, their chips become unsuitable for enterprise servers. Datacenters avoid them and buy AMD. Intel loses enterprise market share and revenue. Reduced revenue causes the next round of layoffs, probably again hitting people working on things that keep the business running. Intel shoots itself in the foot and is surprised about the consequences.
And AMD becoming a monopoly, nice, nice world
Monopolies are good for the consumer as it makes purchasing decisions easier. Some tech markets such as the GPU one show how well a monopoly can work for shareholders.
Am i being wooshed?
Monopolies are good for the consumer?
Respectfully, what the fuck are you talking about. That has never been true ever.
Based on the last sentence, I believe it’s satire.
I thought the second sentence made that clear but apparently not.
If only there was an understood way to convey sarcasm.
There is /s but I opted not to use it. Using it makes the sarcasm less “fun”/rewarding for me.
I am quite surprised at the number of people who think anyone exists that would seriously praise a monopoly and celebrate shareholder value. Sadly, people thinking that means they have encountered people who hold such beliefs seriously. Which is quite sad.
How did you get that downvote? Must have been a mistake.
Yeah, wouldn’t it be amazing if for example Apple has the monopoly on the smartphone market, so your purchasing decision would be to buy an iPhone, or a slightly larger iPhone? And they would have no competition - which is the definition of a monopoly - so they could price them at whatever they wanted to, they could even make the American iPhone a reality, because let’s be real, it’s kinda hard to function without a phone these days so who cares if it costs $5k, you can just sell a kidney, right? You got two of 'em.
Monopolies are such a great thing for consumers :)
(Please don’t sell a kidney for an iPhone, it’s a really bad decision.)
That is what the tariffs are for, and Apple has promised to invest a shit ton of money into manufacturing those phones here… so they can raise prices to the ceiling of the tariffs imposed on China, India, Thailand, Indonesia… etc. There are no other smartphones manufactured in the US, so they will effectively… have a monopoly.
That story is really sad. To me it seems like a failing of the parents/education system to not teach the son who was 17 that selling a kidney is a bad idea.
Decisions easier, quality and sanity suffers, though.
When I got a new desktop PC this year I specifically avoided anything with Intel in it because of how badly they dropped the ball with their CPUs basically disintegrating.
This is just a small glimpse into how Intel is breaking down from the inside. It may take a few years but if the US government doesn’t intervene somehow on their behalf I truly think Intel might be done for in the next 5 years.
Same. Intel and Nvidia are both on the boycott list.
As great as AMD is right now, I still don’t want them to become a monopoly. The fact that we have a duopoly is already a major problem.
Maybe we can dig up VIA and get a new Cyrix CPU.
Imagine if x86-64 got blown open because of it? Might literally be the best thing to happen to computing in like 40 years.
Really fuckin’ doubt it’ll happen, but a girl can dream XP
Or, what if it just became irrelevant? It’s had a great run. But honestly ARM has shown plenty of versatility and power, while being licensable, unlike x86. And things like RISC-V have similar if not better potential.
It’s always going to be relevant, even if only emulated, simply because of how many code bases are stuck on x86/x86-64.
Open sourcing it and all of its extensions solves the licensing problems of not only itself, but Arm, while providing a battle tested architecture with decades of maturity.
Also imagine the fun FPGA consoles could have with that?
Oh, I have no issues with it being relevant in the same sense that the Z80, 68k, or 6502 are still relevant. Just not as part of a controlling duopoly.
Thank you. Thank you for giving me hope lol
IMO, Intel is circling the drain and will die without intervention. And their death will have some pretty crazy ramifications.
If the US had competent leaders, they’d realize Intel was important to global security, and they’d come up with some sort of way to break up the fab and design business.
No one wants to send their designs to Intel’s fab because they don’t want Intel to copy their homework. That’s why Intel’s design competitors use TSMC. And TSMC scales faster because of increased money and experience.
Trump’s 100% tariffs on chips made outside the USA are puzzling. Is it an attempt to force Intel, who do make chips in the USA, to become more competitive just through bullying everyone? Or does he know it will just cause more trouble, and is he trying to drive Intel into the ground for revenge because they took Biden’s money? Why is he also demanding that Intel’s CEO resign? Does none of it make sense because Trump is a crazy old narcissist who has lost touch with reality and is now losing his mind?
There is no real rationale. Trump is all impulse, no long-range thinking/planning.
The tariff thing just shows that Trump doesn’t understand why people use TSMC. TSMC doesn’t have a brand of chips that they sell, and they can’t copy your designs.
Companies don’t manufacture with Intel because Intel isn’t just their manufacturer, it’s their competitor. Also, Intel’s fab is now behind the curve. It literally can’t manufacture some of the shit Apple and Nvidia want.
Trump sees a rash and is prescribing cortisone cream. But the skin irritation is from melanoma.
Trump loooves to take action. Coherent plan or direction is irrelevant.
Good luck US, still some to go.
Right? Good Luck America. FAFO.
Point 3 of Umberto Eco’s traits of ur-fascism.
Irrationalism also depends on the cult of action for action’s sake. Action being beautiful in itself, it must be taken before, or without, any previous reflection. Thinking is a form of emasculation. Therefore culture is suspect insofar as it is identified with critical attitudes. Distrust of the intellectual world has always been a symptom of Ur-Fascism, from Goering’s alleged statement (“When I hear talk of culture I reach for my gun”) to the frequent use of such expressions as “degenerate intellectuals,” “eggheads,” “effete snobs,” “universities are a nest of reds.” The official Fascist intellectuals were mainly engaged in attacking modern culture and the liberal intelligentsia for having betrayed traditional values.
There’s no way politicians will let one of the most important chip manufacturers die. If push comes to shove, they’ll get subsidies
or they will just move overseas.
Probably not, but these morons will probably wait until too much damage is done. They’re shortsighted AF.
Didn’t the orange one threaten TSMC with higher tariffs unless they buy 49% of Intel?
… Have you seen the competence of the politicians on display in the US right now?
sure, but if the us doesn’t china will.
China will subsidize Intel?
Intel can go anywhere. To the highest bidder and the lowest cost.
States can control who companies do business with, and also who buys and sells them.
I’d be surprised if the US let any foreign actor buy Intel, and pretty sure specifically US chip companies (production, design, and IP) are barred from doing most kinds of business with China.
Another reason to go for Amd
I’ve been a gushing fanboy since I had a discount $200 laptop that ran mass effect 3 with an integrated GPU.
So just stick with what I’ve been doing and avoid Intel? Got it.
AMD all the way baby!
Sure, but in the meantime I need to work with what I have… which is Intel (on some machines, at least).
Nothing wrong with that, but when given the choice… I’ll go AMD. I think I bought an i5 one time only in my life and I’m old.
Without Intel processors, Linux wouldn’t have been possible in the first place.
But today we have good processors from many different manufacturers. The Linux community must, and can, stay alive even without the support of one major player.
We don’t have that many other processors, though. If you look at the desktop, there is AMD, and there is Apple silicon, which is restricted to Apple products. And then there is nothing. If Intel goes under, AMD might become the next Intel. It’s time (for the EU) to invest heavily into RISC-V, the entire stack.
ARM is coming. RISC-V is coming. Some Chinese brands have been seen, too.
Neither are commonly available in desktop form factors and they usually require custom builds for each board to work.
And for many x86 will remain an important architecture for a long time
ARM is more oriented towards servers and mobile devices for now. Sure, we saw Apple demonstrating desktop use, but there isn’t much out there for desktops yet. RISC-V is far away, and Chinese CPUs are not competitive. “It’s coming” doesn’t help in the short term, and is questionable in the mid term. 🤷‍♂️ Yes, alternatives will come eventually, but it takes a lot of time and resources.
There is also ARM, found on Apple, Raspberry Pi, and Orange Pi, but those are SBCs (except Apple). They can always be turned into normal laptops and desktops and such.
The only problem with ARM is that it’s a closed ISA, like x64.
The only problem with both ARM and RISC-V is that they are RISC, not CISC like x64, so you get better power consumption with lower clock speeds: bad for desktops, great for laptops and such.
Thanks for coming to my TED talk.
RISC is perfectly good for desktops, as demonstrated by Apple. Microcontroller chips are suitable for light desktop tasks, but they are nowhere near modern x64 CPUs. For now.
It doesn’t really make much of a difference on modern CPUs, as instructions are broken down into RISC-like micro-ops even on CISC CPUs before being processed, to make pipelining more effective.
This is the correct answer. A modern x86 (x64) chip is a RISC CPU with a decoder that can decode a CISC ISA.
From what I remember, one of the problems with CISC is that it has variable-length instructions, and these are harder to predict since you have to analyze all instructions up to the current one, whereas for RISC you know exactly where each instruction is in memory/cache.
one of the problems with CISC is that it has variable-length instructions
RISC systems also have variable-length instructions; they’re just a bit stricter with the implementation, which alleviates a lot of the issues (ARM instructions are always either 16 bits or 32 bits, while RISC-V is always a multiple of 16 bits and self-describing, similar to UTF-8).
Edit: Oh, and ARM further restricts instruction length based on a CPU flag, so you can’t mix and match at an instruction level. It’s always one or the other, or it’s invalid.
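That self-describing property is easy to show concretely: in RISC-V, the low bits of the first 16-bit parcel alone tell you the instruction’s length. A minimal sketch (48-bit and longer formats from the spec are omitted):

```python
def riscv_insn_length(first_parcel: int) -> int:
    """Return instruction length in bytes from the first 16-bit parcel,
    per the RISC-V base encoding rules:
      - low two bits != 0b11                     -> 16-bit compressed
      - low two bits == 0b11, bits[4:2] != 0b111 -> 32-bit standard
    """
    if first_parcel & 0b11 != 0b11:
        return 2  # compressed (C extension)
    if (first_parcel >> 2) & 0b111 != 0b111:
        return 4  # standard 32-bit encoding
    raise NotImplementedError("48-bit and longer encodings not handled")

# 0x4501 is c.li a0, 0 (compressed); 0x00000513 is addi a0, x0, 0
assert riscv_insn_length(0x4501) == 2
assert riscv_insn_length(0x0513) == 4
```

So a decoder never has to look past the first two bytes just to find the next instruction boundary, which is what makes the format cheap to parse compared to x86.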
I was thinking about Apple’s M CPUs, which have a fixed instruction length and benefit from it. It was explained on AnandTech years ago; here is a brief paragraph on the topic. Sadly, the AnandTech article(s) aren’t available anymore.
Since this type of chip has a fixed instruction length, it becomes simple to load a large number of instructions and explore opportunities to execute operations in parallel. This is what’s called out-of-order execution, as explained by Anandtech in a highly technical analysis of the M1. Since complex CISC instructions can access memory before completing an operation, executing instructions in parallel becomes more difficult in contrast to the simpler RISC instructions.
This isn’t completely true. Even a basic instruction like ADD has multiple implementations depending on the memory sources.
For example, if the memory operand is in RAM, then the ADD needs to be decoded to include a fetch before the actual addition. RISC doesn’t change that fact.
Yes, but RISC knows the exact position of that instruction in cache and how many instructions fit in the instruction cache or pipeline. Like you said, it doesn’t help with the data cache.
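A toy sketch of why fixed-length encodings make wide decode easy: with fixed 4-byte instructions, every start offset is known up front, so many decoders can work in parallel, while a variable-length stream must be scanned sequentially to find each boundary (illustrative Python, not how real hardware is built):

```python
def fixed_length_starts(n: int, size: int = 4) -> list[int]:
    # Every start offset is i * size, known immediately; all n
    # instructions can be handed to parallel decoders at once.
    return [i * size for i in range(n)]

def variable_length_starts(lengths: list[int]) -> list[int]:
    # Each start depends on the sum of all previous lengths, so
    # finding boundaries is an inherently sequential scan.
    starts, offset = [], 0
    for length in lengths:
        starts.append(offset)
        offset += length
    return starts

assert fixed_length_starts(4) == [0, 4, 8, 12]
# x86-style stream where instructions vary in size:
assert variable_length_starts([1, 3, 2, 6]) == [0, 1, 4, 6]
```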
alr thanks for the info
Why did Linux need Intel processors specifically?
The PC was new. There were only Intel chips in PCs. Linux was made for the PC.
Backstory: Prof. Tanenbaum was teaching operating systems. His example was MINIX (his own academic example). This motivated one student to try to make a new operating system for PCs, doing some things like the professor, and other things quite differently. This student knew the specifics of the Intel chips and used them well for performance etc.
Sure, but if Intel hadn’t made the 8086 and that entire family line was severed, Linux would have just been made for Motorola 68000 series or something. Or one of the Acorn ARM chips that did the rounds at the time.
Oh man, you just unlocked memories of my Mom’s 8086 back in the late 80s… her first pc was an Apple but the software… there was more for the Intel. I remember so much disk inserting and printer interface issues and the DOS. We had like boxes and binders of those huge discs…
Mostly nontechnical person here: how much active maintenance does this driver need? To the uninitiated, it sounds like it should be basic and standard.
Not much, but it does need to be maintained. Every time someone pushes an update to code that the driver uses, something changes in the Linux kernel, or Intel releases new hardware that needs a different register map or whatever, the driver can fail. If nobody steps up to maintain it, it could stop working in a matter of months.
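As a rough illustration of the “new hardware needs a different register map” failure mode — a toy Python sketch with made-up model IDs and register offsets, not real coretemp code (real drivers are kernel C) — a driver only handles hardware it has a table entry for, so an unmaintained table silently excludes every new model:

```python
# Hypothetical ID table: each supported model maps to its register layout.
# New silicon needs a new entry (and sometimes different offsets),
# or the driver simply won't bind to it.
SUPPORTED = {
    0x0601: {"temp_reg": 0x19C},  # made-up "Gen 1" model
    0x0602: {"temp_reg": 0x19C},  # made-up "Gen 2": same register map
}

def probe(model_id: int) -> dict:
    info = SUPPORTED.get(model_id)
    if info is None:
        # This is what "nobody maintains it" looks like in practice:
        raise RuntimeError(f"model {model_id:#06x} not supported; driver needs an update")
    return info

assert probe(0x0601)["temp_reg"] == 0x19C
```

Maintaining the driver mostly means keeping that table (and any changed register layouts) current as new chips ship.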
Okay, thanks!
This doesn’t sound like a 40-hour per week kind of a job
I doubt maintaining a single driver was their only responsibility.
Hey, this is kind of interesting, since I just met up with my friend who works for Intel today for his kid’s first birthday, and he was telling me about this issue and how they’re trying to get him to be part of a related team (not specifically related to Linux) on top of his other responsibilities…
He went on at some length describing how absolutely absurd the whole structure was of related systems and how it’s a miracle any of it works lol
it’s a miracle any of it works
After 25 years in software development, I can say that’s how I feel as well.
I’m a network engineer and lately I’ve dived deep on wifi. I feel the same way about wifi.
Man, wifi is black magic. Not the nice kind that pulls kittens out of hats, but the kind that needs a blood sacrifice to work.
Since I had comp architecture in undergrad I find it a miracle that any of it works.
Stop buying Intel products, got it thanks!