Oh yeah, another thing I noticed: Phoenix blobs don't animate. The centers of the Plasma blobs don't look as bright as they did in D1, but that's a D2-specific palette thing, so it isn't really worth fixing for now.
pATCheS wrote:
I think he means that all of the vertices for each face are sent to the graphics card, as opposed to using something such as a triangle strip, where the previous two verts are used for the next triangle (this saves gobs of memory bandwidth in talking to the hardware through the gfx driver).

Exactly what I'm referring to. Performance is based on the number of vertices transformed. Even with the clipping, there are an awful lot of verts being transformed.
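For illustration, here is a rough sketch of that difference in immediate-mode OpenGL. This is not D2X's actual rendering code, and the function names are made up; it only shows why a strip sends (and transforms) far fewer vertices than submitting every face on its own.

Code: Select all
#include <GL/gl.h>

/* Sketch only, not D2X code: every face submits its own four vertices,
   so corners shared between faces get sent and transformed repeatedly. */
void draw_faces_separately(const float (*verts)[3], int num_faces)
{
	glBegin(GL_QUADS);
	for (int i = 0; i < num_faces * 4; i++)
		glVertex3fv(verts[i]);
	glEnd();
}

/* With a triangle strip, after the first two vertices each additional
   vertex produces another triangle, so far fewer vertices are submitted. */
void draw_strip(const float (*verts)[3], int num_verts)
{
	glBegin(GL_TRIANGLE_STRIP);
	for (int i = 0; i < num_verts; i++)
		glVertex3fv(verts[i]);
	glEnd();
}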
pATCheS wrote:
Of course, if you wanted to be real fancy about it, you could use the accepted chromatic intensities (I think green is weighted highest at around .5, red around .3, and blue around .2). Dunno how three float multiplies would compare speed-wise to three integer sums and a divide though. Not that it matters, since it'd only be done once per texture load anyway.

Code: Select all
// Do this for each texel in an explosion texture
for (int x = 0; x < w; x++) {
	for (int y = 0; y < h; y++) {
		pixels[x][y].alpha = (pixels[x][y].red + pixels[x][y].green + pixels[x][y].blue) / 3;
	}
}
Code: Select all
typedef struct tPixel {
	unsigned char red, green, blue, alpha;
} tPixel;

// Weighted alpha using integer math only: red 0.3, green 0.5, blue 0.2
tPixel *p = pixels;
for (int i = w * h; i; i--, p++)
	p->alpha = (p->red * 3 + p->green * 5 + p->blue * 2) / 10;
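For comparison with the question about float multiplies, here is a sketch of the floating-point version of the same weighting. It reuses the tPixel struct above; the function name is made up, and the weights are simply the approximate values from the quoted post.

Code: Select all
/* Floating-point variant of the weighting above (sketch; red ~0.3,
   green ~0.5, blue ~0.2). Since this only runs once per texture load,
   the speed difference versus the integer version hardly matters. */
void calc_alpha_float(tPixel *pixels, int w, int h)
{
	tPixel *p = pixels;
	for (int i = w * h; i; i--, p++)
		p->alpha = (unsigned char) (p->red * 0.3f + p->green * 0.5f + p->blue * 0.2f);
}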
Diedel wrote:
The problem is that currently I simply hand the bitmap and a global alpha value to the OpenGL system ... no bit-wise rendering done here. D2 bitmaps are palettized, one byte per pixel bitmaps. I would need to convert them to RGBA bitmaps, decoding each palette color to its nearest RGB equivalent, and have that one rendered ... phew.

OpenGL can use the bitmap as the alpha. I think it's just a matter of telling it to do it. I'll look into it and get back to you.
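Here is a sketch of the palette decode Diedel describes. The palette and bitmap layout shown are assumptions for illustration, not D2X's actual data structures; the alpha is derived from the decoded color the same way as in the code above.

Code: Select all
/* Sketch only -- the palette/bitmap layout is assumed, not D2X's actual
   structures. Expands an 8-bit palettized bitmap to RGBA, deriving the
   alpha from the decoded color. */
void palettized_to_rgba(const unsigned char *indices,
                        const unsigned char palette[256][3],
                        tPixel *rgba, int w, int h)
{
	for (int i = 0; i < w * h; i++) {
		const unsigned char *c = palette[indices[i]];
		rgba[i].red   = c[0];
		rgba[i].green = c[1];
		rgba[i].blue  = c[2];
		rgba[i].alpha = (c[0] * 3 + c[1] * 5 + c[2] * 2) / 10;
	}
}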
Diedel wrote:
patches,
I have modified D2X so that it can have a texture's pixels' alpha calculated from the pixel colors, but the results are rather opaque explosions, so I am still experimenting with this.

Are you modifying the actual alpha of the image? I don't think that is necessary, as it can do it when it renders.
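One possible reading of "it can do it when it renders" is to leave the texture data alone and let blending handle the fade at draw time. The sketch below shows that idea; the function names are made up and this is only one interpretation, not necessarily what the posters had in mind.

Code: Select all
#include <GL/gl.h>

/* Sketch: purely additive blending makes dark texels contribute little and
   black texels nothing, so no per-pixel alpha has to be written into the
   texture at all. */
void begin_explosion_pass(void)
{
	glEnable(GL_BLEND);
	glBlendFunc(GL_ONE, GL_ONE);            /* dst = src + dst */
}

/* If per-pixel alpha has been computed (as in the code above), modulate
   the source by it instead. */
void begin_explosion_pass_with_alpha(void)
{
	glEnable(GL_BLEND);
	glBlendFunc(GL_SRC_ALPHA, GL_ONE);      /* dst = src * alpha + dst */
}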
Code: Select all
<frame #> = <frame count> - ((<frame count> - 1) * <time to live left> / <total life time>) - 1
Code: Select all
<frame #> = (<total life time> - <time to live left> % <total life time>) / <frame time>
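As a sketch, the first formula wrapped into a small helper; the function and parameter names are made up for illustration.

Code: Select all
/* Maps remaining lifetime onto an animation frame index, starting at
   frame 0 when the effect is created and ending at <frame count> - 1
   when its time to live runs out. */
int explosion_frame(int frame_count, float total_life, float time_left)
{
	int frame = frame_count - (int) ((frame_count - 1) * time_left / total_life) - 1;
	if (frame < 0)
		frame = 0;
	if (frame > frame_count - 1)
		frame = frame_count - 1;
	return frame;
}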
Gregster2k wrote:
Every version of D2X-W32 I've ever used has crashed to desktop on startup with an NTDLL.DLL error message.

Diedel wrote:
Giving some details like OS, gfx & sound hardware, driver versions would have been a good idea.

Windows XP Professional, ATI Radeon 9800 Pro, latest Omega drivers. ASUS A7N8X Deluxe with onboard sound. AMD Athlon XP 2000+. 1GB PC3200 RAM (underclocked, I might add, because of the 2000+'s slow FSB, so it's exceptionally stable). Heck, it did this on a different video card too. It's not my configuration, and no viruses have infected NTDLL.DLL.
I have D2X-W32 running on WinXP pro & home, Win2K pro with Radeon 9800 pro, X800 XT, GF 5200, GF2 400 MX, Soundblaster onboard, Soundblaster Audigy w/o problems.