• pelya@lemmy.world · 4 hours ago

    A real-world optical chip that you can actually buy is exciting. Still, it seems to be far from a consumer-grade optical CPU. It’s more like a microcontroller that you stick at the end of your 10 Gbit fiber-optic cable to receive processed optical data.

    Memory is going to be a big problem, because any AI workload requires a ton of it. Replacing even a simple 16 GB DRAM chip with an all-optical equivalent means you are essentially creating 16 GB of L1 CPU cache, which would be like 100 server CPUs stacked together, used only for their cache memory (rough numbers below). And if you use conventional DRAM instead, you need to introduce an optic-to-electric converter, which will be the speed bottleneck of the system, and probably expensive too.
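    To make that comparison concrete, here is a back-of-envelope sketch. The ~160 MB of on-die cache per server CPU is my own assumption, not a figure from the article, and it varies a lot by part and generation:

        # Back-of-envelope check of the "100 server CPUs of cache" claim.
        # cache_per_server_cpu_mb is an assumed figure: total on-die SRAM
        # (L1 + L2 + L3 combined) for a large server CPU, roughly 160 MB.
        target_memory_gb = 16            # the DRAM chip being replaced
        cache_per_server_cpu_mb = 160    # assumed on-die cache per server CPU

        cpus_needed = (target_memory_gb * 1024) / cache_per_server_cpu_mb
        print(f"{target_memory_gb} GB as SRAM-like memory is about "
              f"{cpus_needed:.0f} server CPUs' worth of cache")
        # prints: 16 GB as SRAM-like memory is about 102 server CPUs' worth of cache

    Swap in a different per-CPU cache figure and the result moves, but the order of magnitude (on the order of a hundred CPUs) holds.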

  • owenfromcanada@lemmy.ca · 10 hours ago

    I’ve wondered for years when we’d reach this point. Optics-based processors have the potential to blow way past the limitations of electrical/copper circuits, at least in theory. I’m curious to see where this leads.