OSY 1.0 Thread Viewer

Thread #: 1023

I am selling my computer!

DuffMan

Thu Oct 4 18:45:12 2001

Yes, I've decided to sell my computer, well at least the motherboard and processor anyway. My P3 750 was never quite stable at 1GHz, and I am fed up with it. I'll be glad to get rid of this worthless P3, but I'll miss my motherboard dearly. It has a lot of sentimental value.

From the money I make selling it, I will be able to buy a much faster motherboard and CPU. The Athlon XP and the new Celeron both look very promising. It looks like a return to the old days, when Celeron 300s@450 and Celeron 366@550 were faster than the Pentium 2s and 3s available at the time. Tom's Hardware got a Celeron 1200 to run at 1500, which would probably be faster than a 2GHz P4, and this is a <$100 processor.

In the meantime I have a K6 333@375 and a Cel 566@850 to use, so I'll get by.

Riso

Thu Oct 4 18:52:43 2001


New Celeron = P!!! w/ Tualatin core (or somethin')

And sell the POS called K6 as well.

DuffMan

Thu Oct 4 19:13:27 2001

Yep, it's basically a .13 micron P3. They even have 256K of cache now.

Still 100MHz FSB, which is bad, unless you overclock, in which case it is good.

Who would buy my K6? Anyway, if I sold it, I'd have to put in a P200, which would just be slower. It's my Win2k Server machine.

Imitation Gruel

Fri Oct 5 20:11:17 2001

A 1.2GHz PIII on a 100MHz FSB would be faster than the new Tualatin-cored Celeron because of superior cache associativity.

However, with good cooling it might be possible to get that 1.2/100 up to 1.6/133.
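For anyone puzzling over the "1.2/100 up to 1.6/133" shorthand, a quick sketch of the arithmetic (the locked-multiplier assumption is mine, though it matches Intel CPUs of the era — the core clock is just FSB × multiplier, so raising the bus raises the core clock):

```python
# A 1.2GHz part on a 100MHz bus implies a 12x multiplier.
multiplier = 1200 / 100

# The multiplier is locked, so moving the FSB to 133MHz scales the core clock:
overclocked_mhz = multiplier * 133

print(overclocked_mhz)  # 1596.0, i.e. roughly "1.6/133"
```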

But I don't overclock; hence I'd probably go with an Asus TUSL2 and a Pentium III 1.26GHz/512K. That should be pretty close to a 2.0GHz P4, for less money (but with less upgrade path).

DuffMan

Fri Oct 5 21:28:48 2001

Naw, the new Celeron has 256K of cache, and therefore the same associativity as a normal P3.

The old Celeron had 4-way instead of 8-way because disabling half the cache cuts your associativity in half. It really had the same associativity as a P3 relative to its amount of cache. At least that's how I understand it.
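That reading can be sketched numerically. This is a toy model of a set-associative cache, not verified die-level specs: capacity = sets × ways × line size, so disabling half the ways halves both capacity and associativity while leaving the number of sets the same.

```python
# Toy model: relate cache capacity, associativity (ways), and set count.
# The 32-byte line size and the 256K/128K figures are illustrative.
def cache_params(capacity_kb, ways, line_size=32):
    sets = capacity_kb * 1024 // (ways * line_size)
    return {"capacity_kb": capacity_kb, "ways": ways, "sets": sets}

p3 = cache_params(256, 8)           # full cache, 8-way
old_celeron = cache_params(128, 4)  # half the ways disabled, 4-way

# Halving the ways halves capacity and associativity together,
# so the number of sets is unchanged:
assert p3["sets"] == old_celeron["sets"]
```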

Riso

Fri Oct 5 21:44:29 2001

Theory: Old Celeron = P3s with a defective cache.

OscarWilde

Sat Oct 6 02:22:40 2001

Why not a Duron instead of the overpriced Intel chips?

AllYorBaseRBelong2Us

Sat Oct 6 20:00:42 2001

"Theory: Old Celeron = P3s with a defective cache."

I'm not sure this was the case.

DuffMan

Sat Oct 6 20:17:53 2001

Celerons aren't generally overpriced. Though now I'm probably going to get an Athlon, mainly because all the Tualatin-supporting chipsets are crap.

The 815EB has crappy memory performance and still has that 512MB limit, which could be a problem down the road.

The VIA Apollo 133T also has crappy memory performance, as well as the general problems that a VIA chipset brings with it. Those problems would be worth it if I were actually getting good performance in exchange.

With the VIA Apollo 266T, memory bandwidth improves thanks to DDR, but latency still sucks. Performance in real-world apps turns out to be about the same as a BX board.

So unless either VIA or SiS makes a better-performing chipset, I'm not going Intel. There's also the possibility that PowerLeap will make an FCPGA-to-FCPGA2 converter, and I could run a Celeratin in an old BX board.

Most likely, though, I will get an Athlon and go with VIA's KT266A chipset. The "A" revision gives both good latency and good bandwidth. Also, you can get high-quality DDR from Crucial at only $33 for 256MB with free shipping.

AllYorBaseRBelong2Us

Sat Oct 6 20:23:46 2001

A converter for my BX6R2 would absolutely rool!

I'll get another 2 years out of that machine at that rate.

DuffMan

Sat Oct 6 20:29:30 2001

BX is and always has been teh win!

DuffMan

Sun Oct 7 05:51:15 2001

Ooooooh. [url]http://www.powerleap.com/Products/iP3T.htm[/url]

Now this will be a tough decision.

AllYorBaseRBelong2Us

Sun Oct 7 20:03:09 2001

As the Uncanny Mystic Grey would say, "This thread has been informative."

OscarWilde

Mon Oct 8 01:35:40 2001

What video card are you gonna get?

I was reminded this weekend by a friend of mine who pointed out how sharp my display was. I'm assuming it's because I have a 19-inch DiamondTron from Mitsubishi and a Rage 128 Pro AGP card set at 1280x1024 @ 85Hz that is making my desktop really sharp.
Which got me thinking: am I right that it's the combination that's giving me a sharp display? I've been staying away from the nVidia cards because I've heard too many people complain that the 2D quality is sub-par. I stayed away from the Radeon because it's an overpriced piece of crap that doesn't justify the cost of upgrading from the Rage 128 Pro. Plus, ATI isn't doing well at all on the driver front right now, and their latest patches seem to be causing more problems.

I want a Radeon II though:
1) ATI cards have better 2D quality (well, on the Mac side at least).
2) The T&L unit on the Radeon II sounds killer.
3) MPEG playback rocks on the ATI cards. I've compared MPEG performance, and I don't know how or why, but the ATI cards are pretty impressive in the quality of their MPEG playback. (This is another thing my friend made some comments about.)

Too bad 3dfx went downhill. Their 2D was pretty decent too.
Anyway, isn't there a GeForce-based card with a really good RAMDAC? I've heard someone mention that there is a manufacturer that uses the GeForce chip but adds their own specialised 2D chip to improve the quality. That sounds like a good idea.

Anyway, what am I going on about?

AllYorBaseRBelong2Us

Mon Oct 8 02:05:54 2001

"ATI cards have better 2D quality (well on the Mac side at least)."

My new Radeon All-in-Flounder has noticeably better 2D than my Hercules MX (noted as one of the best MXs for 2D quality).

Most noticeable is the colour contrast and the purity of colour.

So ATI is having shite driver problems on the Mac too? Sounds like they are acting like a bunch of low-end OEM-vending also-rans.

Hey wait.... :)

"The T&L unit on the Radeon II sounds killer"

Most of what ATI says sounds good on paper. I wonder if this whiz-bang new engine is going to give nVidia a run.

DuffMan

Mon Oct 8 02:18:55 2001

I may keep my GF2MX card, but if I don't, I might just get a Kyro by process of elimination. ATI's driver problems are unacceptable.

I think it's not really nVidia's 2D quality that's the problem, just the RAMDAC and the quality of the filters the manufacturer decides to use. I know my card is pretty crummy in terms of image quality (2D or 3D). I've seen others which are better, though.

AllYorBaseRBelong2Us

Mon Oct 8 02:21:51 2001

Ya know.

If Matrox made their own nVidia-based stuff with quality components, and maybe MPEG/DVD decoding and possibly a TV-tuner variant, they'd clean up the market.

OscarWilde

Mon Oct 8 02:53:08 2001

Yep, ATI is NOT the best at 2D quality on the PC side. You make a good point about Matrox. PC users have more choices in terms of getting a good 2D card with good driver support.
The only problem is that people are too concerned with 3D performance, which the GeForce cards are good at, but only the hardcore gamers should really give a damn. It's sad that nVidia is such a big market force right now when it seems their main focus is 3D performance; consider the market size of nVidia relative to the market size of Quake 3 players and the like.
Actually, in spite of the fact that the GeForce-based cards are good 3D performers, the end "2D quality" is still pretty bad. The graphics are not as sharp and clean during games either. The Voodoo-based cards ruled in this respect. ATI's 32-bit 3D graphics are pretty damn fucking good too!

AYB, you're right about the specs of the Radeon 2. The question is how those specs are going to play out in practical use. Hence I'm just going to take a wait-and-see approach, although I might jump in and get a Radeon 2 card around the time it's released.

DuffMan, 2D quality is not only an issue of the RAMDAC; it's also partly to do with the chip itself. Wouldn't it be strange if a large majority of GeForce-based cards had crappy 2D quality? There is a market for people who want very good 2D quality. Granted, it seems relegated to Mac users, who seem to put "elegance" over anything else. ;)

:(

I want a good all-round graphics card. If it is going to be the Radeon 2, I only hope that their 16-bit 3D graphics problem has been fixed.

DuffMan

Mon Oct 8 07:00:39 2001

Nope, the output from the chip is all digital. It's only when it goes analogue that quality is a factor.

In fact, it's not a matter of 2D quality; it's a matter of image quality in general. Cheap RF filters cause colors to kind of "wash out" instead of having sharp borders.

I would bet a high quality GF card like an Elsa wouldn't have that kind of problem.

I'm typing right now on a 4MB Matrox card from God knows how long ago and it looks great, even on this crappy monitor. The problem is that all the nVidia card makers are in a price war with each other and will sacrifice quality to cut costs a little. Hardly anyone has any brand loyalty to card makers; it all goes to nVidia, and a lot of the business goes to whoever can make 'em the cheapest.

Riso

Mon Oct 8 10:20:07 2001

Repeat:
Crapidia is teh evil.

I shall not buy cards made by the evil marketing FUD company nvidia.

OscarWilde

Mon Oct 8 15:54:01 2001

"Nope, the output from the chip is all digital. It's only when it goes analogue that quality is a factor."

Huh? I'm not an expert, but I think there's an oversimplification there. I'll have to do some research into this.

"In fact it's not a matter of 2d quality, its a matter of image quality in general. Cheap RF filters cause colors to kind of "wash out" instead of having sharp borders."

I was thinking more along the lines of how the graphics card deals with mixing colors and values. There are cheap methods and expensive methods. The cheap ones could mean trade-offs at the chip level, i.e. using a part of the silicon that wasn't specifically designed to do the job but can be used to give a close approximation, that kinda deal. That's my guess.

"I would bet a high quality GF card like an Elsa wouldn't have that kind of problem."

That's what I mean, though. Mac users are complaining about the Elsa cards too. And if I remember correctly, Elsa is the manufacturer of the Mac-version cards too, i.e. the ones you find in PowerMacs are made by Elsa. I'll have to recheck that, though.


"I'm typing right now on a 4mb Matrox card from god knows how long ago and it looks great, even on this crappy monitor."

Yeah, I remember how at one time every card manufacturer would harp on about their 2D quality and so on. At the time that was the focus.
Even when 3D started to become the main focus, 2D quality was still good.
Shit went downhill when nVidia came along. Or that's the impression I get from a lot of sites that do video card reviews.

"The problem is that all the nvidia card makers are in a price war with each other and will sacrifice quality to cut costs a little."

Which is teh suck, but it's also partly down to the design of the chip itself.

"Hardly anyone has any brand name loyalty to card makers, it's all to nvidia and a lot of the business goes to who can make em the cheapest."

I still think people buy on brand loyalty. It's just that the market has become too cut-throat and aggressive. Plus, the problem lies in a very uneducated majority, with too many who have overnight become "experts" in everything because they figured out how to connect to the internet all by themselves. :rolleyes:

Today I had some fucking engineer tell me that Windows 2000 is a bad mistake and that I should look into Windows 98!!!!

AH FUCKER!!!!!

Anyway, I told him that we're a business and that either TI (yes, that's Texas Instruments) works within our environment or they can jack off. :biggrin: Fuck, I hate when I have meetings with "so-called computer experts", especially when they are really just limited to one field. Just 'cause you're an electrical engineer doesn't mean you know shit about Windows. :tongue:
Sorry, had to vent.
I told the guy to get in touch with his software engineer and find out if they have Windows NT-compatible software, or rework the code and recompile. Anyway, I can't go into too much detail about what's going on.