• 0 Posts
  • 173 Comments
Joined 2 years ago
Cake day: June 15th, 2023

  • If you know not only what a torque wrench is, but also how to use it properly, you will likely have no trouble changing brake pads.

    The feeling I get is that auto work goes much much deeper though, and I am interested in resources that offer that knowledge.

    Full engine rebuilds, or even troubleshooting intermittent CANbus issues, sure. But basic maintenance like brake pads or changing out a failed alternator requires only basic hand tools and some minor knowledge you can get from YouTube.


  • Replacing brake pads (not shoes for drum brakes) is a fairly straightforward activity and possibly one of the best (besides perhaps changing engine oil) to perform yourself. YouTube is a great place to start. You can likely even find a full video of pad replacement for your exact model of car.

    What is your current experience with basic hand tools such as screwdrivers, hammers, and wrenches (for hex head fasteners)? Do you know how to replace a flat tire? There’s lots of overlap between that procedure and changing brake pads.


  • How much of the hardware and software you use must be registered, requires internet access to work, or depends on a proprietary app? You don’t actually own anything that fits in those categories, and it can be taken away from you at the manufacturer’s whim.

    While there are certainly commercial products that fall into those categories, there are many that don’t.

    • My 3D printer from Monoprice has a power plug and an SD card slot. No requirement to connect it to the internet at all for it to function.
    • Here are 7 FOSS CAD software packages that aren’t owned by any company.
    • There are countless NFC and Wifi modules that don’t require a “call home” to the vendor that you can’t turn off. Cell modems may be a special case because you’re using a provider’s network.
    • There are lots of large format printers that, once the drivers are installed, need zero network connection to operate. No vendor shutoff possible unless you allow it.
    • Same for CNC machines. Certainly at the high-end industrial scale this may be different, but there are many solutions for home and commercial users that don’t require an always-on connection.


  • Yes, you can definitely do all those things, but they’re far outside of the realm of a normal consumer, and unless you know to look for those things, they’re a lot harder to find.

    I’m confused by your premise then. If you’re saying “Today’s consumer electronics can’t be tinkered with because they require specialized knowledge,” I’d argue that was always the case. How much tinkering could be done on an Atari 2600 from 1977?

    How much tinkering could be done to a VHS VCR from 1989 without specialized knowledge?

    These are prime examples of prior generations of consumer electronics.


  • Thought of this the other day. I bet a lot of us are like this, because in today’s world a lot of things we used to tinker with are gone (electronics are made to be single use and unfixable, cars are proprietary and can rarely be modified or worked on without many many thousands of dollars now, etc).

    I feel the exact opposite. Today I can tinker in ways I never ever could before for two reasons:

    1. so many more technological solutions exist
    • 3D printing
    • CAD
    • wireless (near field, Wifi, and cell network)
    • large format printers for paper, vinyl, and fabric
    • CNC for wood and metal cutting
    2. components are so cheap relative to the past
    • single board computers (Arduino, ESP32, RaspberryPi, etc)
    • high quality optics and CCD cameras
    • mountains of cheap storage
    • small and large LCD displays, eink

    When I started out the cheapest computer was today’s equivalent of about $2000. To be able to buy a whole computer in a Raspberry Pi zero for $10 is insanely awesome! Electronic components from Radio Shack were few and very expensive. Test equipment like oscilloscopes were simply out-of-reach financially. Now I have a handheld one I bought for $200.

    This is an amazing time to be alive for tinkering!







  • Have you ever compared systems from back then on how well they actually worked? For sure the PC was awful. AND MS-DOS was the worst OS in existence at the time.

    Yes, I lived through that period and have firsthand experience.

    Most of those architectures you mention were workstation, server, or mainframe class

    No.

    I think you missed the part of my post where I called out the PPC 601 and Moto 68000 in desktops. PPC was also in workstation and server grade machines, including IBM iSeries midrange systems.

    I don’t understand how you can argue a point that X86 was ever any good, have you ever tried programming assembly on it and on any of the competitors?

    You’re still arguing technical superiority, when that wasn’t the primary factor for folks who bought computers. Consumers didn’t want to throw away their entire computer and software library when going to the next iteration of a company’s product. PC clones made PC computing affordable. Commodore’s Amiga had no clones, only its rival the Atari ST. Apple quickly squashed any Mac clone makers. These companies got greedy: they wanted to sell hardware at a premium price and control their entire ecosystems, just like they had on prior platforms. They starved their pipeline of younger/poorer customers who would eventually be able to afford the premium products. The PC had no such issue and won the computing war of the 80s and 90s.


  • From the little that I have read, their MT performance (or even TCO) isn’t really as great as some of the early previews would lead one to believe…

    TCO is “total cost of ownership.” A very important piece of that in the future is power consumption. Energy prices are rising. This isn’t just the electricity consumed by the CPU, but also the cooling needed to exhaust the heat. Many of the highest performance x86 CPUs will cost substantially more to operate as energy prices continue to rise.

    I just don’t see ARM being a universal silver bullet (a straight line upgrade from x86)

    It’s not there yet, but with Intel fumbling on this one, leaving AMD the only leader in the space, trading one company being dominant for the other doesn’t really serve us well. What I’m pointing out is that it’s not a “straight line” upgrade, but it’s curving ever more toward a non-x86 future.

    and with SoftBank trying to extract more cash out of ARM, things could get interesting.

    I agree, which is why I keep making references to RISC-V, where I think the future will likely go instead. However, ARM showed the industry as a whole that we don’t need to stay with x86 forever, as was the notion before. As in, “if we’ve successfully shown we can replace x86 with ARM, what would prevent us from replacing ARM with something else? Not much.”




  • I think you’re looking at it from a purely technological view, but that is only half the equation.

    No actually it never was. It was always a clumsy mess. The only reason IBM picked the X86 was because Intel also made the cut-down i8088, which only had an 8-bit data bus, making the system easier and cheaper to build.

    The “cheaper to make” was the part that made it pretty good for its day.

    Market penetration and ubiquity were key factors in the overall advancement of computing around the world. The explosion of progress occurred when there was mostly one computing architecture, and writing software for it meant a huge market with a long life. Most importantly, long enough to make back your initial investment and earn a healthy profit.

    The modest ARM, with a tenth the transistors, was 4-5 times faster than a full-fledged 33 MHz fully 32-bit 80386DX!

    And with that performance advantage, why did x86 continue to advance, selling more and more units and eventually becoming the standard for desktop and server computing? Market penetration.

    Back then hardware and software ecosystems were closed. You could learn on a Wang, but that made you useless on a VAX. Your SunOS-on-SPARC knowledge wouldn’t help you very much with Silicon Graphics IRIX on MIPS.

    Contrast that with DOS: your knowledge on an IBM 5150 applied almost identically to a Compaq Deskpro.

    Today X86 is considered pretty good on the desktop, because all the competition has disappeared. Alpha, Motorola, Sparc, MIPS, PowerPC. X86 was never very good compared to any of those.

    Most of those architectures you mention were workstation, server, or mainframe class CPUs and not desktop. Again, from a purely technical view, sure, they were better, but how good is a CPU that you can never afford to buy?

    Even the Motorola (68000 series) and later the PowerPC (for desktops, the 601 etc.) were only in computers that were far more expensive than their equivalent x86 counterparts. It wasn’t for a lack of computing power, but rather that those brands wanted exclusive control of their hardware and would crush any attempt to make clones that lowered the price point. That did NOT serve the end users or the market, which is largely why I think they failed.

    We got the worst OS with MS-Dos and later Windows, and we got the worst architecture with X86.

    We got a single CPU architecture and OS compatibility for almost 40 years. If we hadn’t, it would have taken much longer to evolve to where we are today, where the underlying CPU can be swapped out with relatively lightweight changes for OS support. Today Linux will run on nearly every CPU architecture, including the common x86, ARM, and now even RISC-V. It would have been a much longer path had we had multiple dominant computing architectures all vying for resources.

    I remember standing in front of a wall of boxed video games sorting through them, getting excited to see a title, only to see it wasn’t for my platform. Tandy, Apple II, Atari, TI, Commodore, and all the various iterations in between! A game written for Commodore PET couldn’t run on Commodore VIC-20, and the VIC-20 game couldn’t run on the Commodore 64. X86 changed all that. The same game that ran on the 8088 could run on the 286, 386, 486, Pentium, etc. We needed all of that to get where we are today.





  • I’m not positive, but I think OP is communicating their position extremely poorly.

    I think @AllegraGory@sh.itjust.works means:

    “If a creator of original content posts their content online, subsequent posters shouldn’t capture/crop/alter that work and then upload it elsewhere, robbing the original content creator of attribution or the monetary gains they might have received from hosting it on their original site. Instead, subsequent posters should link to the original content at the original URL, leaving the creator in full control of the hosting and perhaps any revenue they would receive from their work”

    I’m not taking a position either way on this; my post is merely translating what I think OP is actually trying to say. OP, please correct me if I’m wrong.