oops replies all off topic
Hotplugging has nothing to do with PCI (unless you have a multi-million dollar server with hot-plug PCI...).
Try and solve it with a different approach. If you run lspci, what do you get?
I can get you a hotswap pci server for $10,000 :D
Oh, really? That cheap? =)
I must say that it would be really fun, though. I hate losing my uptime just because of hardware modification. Do you think that you could modify a normal PCI implementation to get a hotpluggable computer, or would that _require_ chip modification? It would make a truly interesting project if it's possible with home appliances. Also, does anyone know if there are cheap IDE cards out there with hotplug support?
Also, although it really isn't related to this, do you know when the I2C bus was first implemented on motherboards? I'm doing a whole lot of hardware projects, and using the I2C bus is interesting since you only have one parallel port and one serial port, and USB chips are too expensive. Although I could implement an I2C bus on the parallel port or make an RS-232-to-I2C adapter with a PIC uC, it would be so much nicer to use existing functionality.
The Sun Fire 3800 server has hot-swap PCI, so you don't have to suspend the bus to pull a card out. Now if I could only get the $10,000. As for an ATA controller, the only one I have used was the Adaptec 2400A, which isn't cheap ($300), and I don't think you'll find a cheap one unless you search eBay for an older model card. I doubt you can hack a normal PCI bus to support hot-plug; I believe you need a separate hot-plug controller for that. Though if you are some godly engineer, I guess it could be added to an existing PCI bus, since nothing else changes. You'll also need to write hot-plug drivers for the cards and the OS. You could always get the specs from the PCI SIG and build your own.
About the I2C bus, I'd guess the late '80s. If I had paid more attention in my hardware classes, I'm sure I would know the exact year :) What are you building?
Late 80's? Are you kidding me? I thought it would be the late 90's. I'm not even sure if my 166 MHz Dell servers have an I2C implementation. (Note: I'm not talking about when it was invented, but when it became common on motherboards) Anyway, I just asked Dell support, so I'll find out sooner or later.
I doubt the PCI chipset would be able to handle hotplugging without freaking out, but maybe I'll check it out. And trust me, I'm not a godly engineer. Like with everything else, I'm learning it by myself as it comes my way.
I'm really doing pretty simple hardware things, but ones that are useful (or just cool =)) in some way or other. Right now, I'm building a light dimmer for my room, to be controlled by the computer. Since the digital potentiometer chips I got handle I2C natively, I thought it would be nice to use it. Also, a friend and I are experimenting with combining cheap LCD displays with Microchip's PIC uCs to create a portable terminal emulator. The nice thing is that it will be powered by the COM port's DTR line, so you'll just plug it in and it's ready. I'll use it to output data from my current computer-driven caller-ID system. Right now that uses speech synthesis to say who's calling... it's really nifty. We have a couple more projects in store, too, but nothing current. Some day I'm hoping to interconnect some home appliances with my computers over a P2P PIC-driven IR network, but that's really a future project.
I just guessed late '80s because I knew Philips developed it in the early '80s for use in TVs, etc., and because of how widely it is used today. I figured motherboard designers would have been using it by like '88-'89. Good luck with Dell support answering that one. I haven't had much luck getting help when my laptop has problems. Their fix is always "send it in and we'll replace it." I never really got into hardware that much, but I do like playing with fiber optics.