D2X-XL Bug Reports - MS Windows
Moderators: Grendel, Aus-RED-5
oh dear... The UI problem seems intermittent. I just tried to reproduce it again, and it's not happening. The worst I can get it to do at the moment is not display briefing text the first time briefings are displayed. The other one-frame briefing problem remains. Fun stuff, isn't it? +_+
oh yeah, another thing I noticed: Phoenix blobs don't animate. The centers of the Plasma blobs don't look as bright as they did in D1, but this is a D2-specific palette thing, so it isn't really worth fixing for now.
pATCheS wrote:I think he means that all of the vertices for each face are sent to the graphics card, as opposed to using something such as a triangle strip, where the previous two verts are used for the next triangle (this saves gobs of memory bandwidth in talking to the hardware through the gfx driver).

Exactly what I'm referring to. Performance is based on the number of vertices transformed. Even with the clipping, there are an awful lot of verts being transformed.
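The saving from triangle strips is easy to quantify: N triangles sent individually cost 3N vertices, while a single strip costs N + 2. A quick sketch of the arithmetic (illustrative only, not D2X code):

```c
#include <assert.h>

/* Vertex counts for N triangles, sent as individual triangles
 * versus as one triangle strip. Illustrative arithmetic only. */
static int verts_individual(int n_triangles) { return 3 * n_triangles; }
static int verts_strip(int n_triangles)      { return n_triangles + 2; }
```

For a two-triangle segment side that's 6 vertices versus 4; over tens of thousands of faces per frame, the difference in driver traffic adds up.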
- De Rigueur
- DBB Admiral
- Posts: 1189
- Joined: Wed Jun 06, 2001 2:01 am
- Location: Rural Mississippi, USA
I'm having some mouse problems.
In the config screens for mouse and keyboard, I can't use the mouse to select the boxes to modify (I can use the arrow keys and enter keys to do this.) With the mouse, I can move the cursor to about an inch below the box I want to select and left-clicking will highlight the box, but double-clicking won't put the "?" in the box. BTW, the mouse works fine in gameplay.
Also, I can't configure the use of the middle mouse button or the mouse wheel. They are just not recognized when I try to assign them to a function.
I have logitech mx510 and nostromo n52 w/ latest drivers.
The version of d2x is 1.3.40.
Thanks for your help.
patches,
- I will look into the wantmip thingie.
I will also rework the transparent explosion stuff. I will however definitely not manipulate explosion effects pixel-wise.
Phoenix blobs were animated in D2?
I have made D2X render Plasma blobs transparently, so they're not as bright as they used to be. I like the effect, though.
Re briefing movies: Here at home I have a frame flashing around each robot at the 1st frame - it did not happen on my workplace machine. Weird. The palette problem is gone, though. The frame stems from a fullscreen variable that needs to be set but obviously only gets set after the 1st frame. Movies not played in fullscreen have a frame around them to make them look better in resolutions higher than 640x480.
Re: Disappearing menus: This might have something to do with blending bitmaps. I really cannot tell why this sometimes happens. Actually what happens is that D2X creates the menu, prints the text, and then blits the background over it. Don't ask me why.
Re vertices: D2 supports 900 segments, each having 6 sides consisting of max. 2 faces. That makes 48 vertices per segment, totalling approx. 43,200 vertices per frame. Don't tell me that's a lot. Btw, a level has quite a few open sides, so there are fewer vertices to be rendered, as those segment sides aren't rendered. Then, D2 excludes a lot of segments from rendering with a kind of hidden surface (rather: hidden segment) algorithm.
If I don't cap frame rate at all in D2X-W32, I get about 850 - 1050 fps (that with some code that effectively creates a frame capping effect, too). Ok ok, I don't have a GF 5200 or Radeon 9250 at home.
- D'oh you found it out. For some weird reason mouse navigation in config menus doesn't work. You should however be able to assign the middle button/wheel to a game function. I am using it for the afterburner.
pATCheS wrote:Of course, if you wanted to be real fancy about it, you could use the accepted chromatic intensities (I think green is weighted highest at around .5, red around .3, and blue around .2). Dunno how three float multiplies would compare speed-wise to three integer sums and a divide though. Not that it matters, since it'd only be done once per texture load anyway.

Code: Select all
// Do this for each texel in an explosion texture
for (int x = 0; x < w; x++) {
    for (int y = 0; y < h; y++) {
        pixels[x][y].alpha = (pixels[x][y].red + pixels[x][y].green + pixels[x][y].blue) / 3;
    }
}
Code: Select all
typedef struct tPixel {
    unsigned char red, green, blue, alpha;
} tPixel;

tPixel *p = pixels;
for (int i = w * h; i; i--, p++)
    p->alpha = (p->red * 3 + p->green * 5 + p->blue * 2) / 10;
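The integer weights above (3/5/2 over 10) approximate the standard luma coefficients pATCheS mentioned. A floating-point variant using the common Rec. 601 weights (0.299/0.587/0.114) might look like this - a sketch, not actual D2X code:

```c
#include <assert.h>

/* Sketch: per-texel alpha from Rec. 601 luma weights. The tPixel
 * layout mirrors the struct above; this helper is illustrative,
 * not taken from the D2X source. */
typedef struct tPixel {
    unsigned char red, green, blue, alpha;
} tPixel;

static void luma_alpha(tPixel *pixels, int w, int h)
{
    for (int i = 0; i < w * h; i++) {
        tPixel *p = &pixels[i];
        /* 0.299 R + 0.587 G + 0.114 B, rounded to nearest integer */
        p->alpha = (unsigned char)(0.299f * p->red +
                                   0.587f * p->green +
                                   0.114f * p->blue + 0.5f);
    }
}
```

Since the weights sum to 1.0, white stays at alpha 255 and black at 0, just like the integer version.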
Diedel wrote:The problem is that currently I simply hand the bitmap and a global alpha value to the OpenGL system ... no bit-wise rendering done here. D2 bitmaps are palettized, one-byte-per-pixel bitmaps. I would need to convert them to RGBA bitmaps, decoding each palette color to its nearest RGB equivalent, and have that one rendered ... phew.

OpenGL can use the bitmap as the alpha. I think it's just a matter of telling it to do it. I'll look into it and get back to you.
"I would need to convert them to RGBA bitmaps, decoding each palette color to its nearest RGB equivalent, and have that one rendered ... phew."
The OGL wrapper already does that. You wouldn't be able to get smooth bilinear filtering or true color lighting with a palette of 256 colors. The function that performs the conversion is ogl_filltexbuf() in ./arch/ogl/ogl.c, and it's only called by ogl_loadtexture() in the same file (the first instance is in a #if 0 block which might as well be taken out). The trick is figuring out which textures need the additional processing done on them. You know a lot more about how textures are loaded, so maybe you could create a path of bools along the call tree to say whether or not to put in the alpha information. If you want to at least see what it looks like, have it exclude 64x64 textures so you can get a fair idea. It does look really nice. From what you're saying, I guess you're not messing around much in the OGL side of things. It's probably not worth worrying about putting it in now, though, since Lehm is rewriting the whole mess anyway (or is he? I might've misinterpreted something). Hopefully his version will make it easier to implement. *wink wink nudge nudge*
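Conceptually, the conversion ogl_filltexbuf() performs boils down to a palette lookup per texel plus an alpha decision. A minimal sketch of that idea - struct and function names are mine, not d2x's, and the transparent index is just a placeholder for whatever reserved index the palette uses:

```c
#include <assert.h>

/* Sketch of palettized -> RGBA expansion, as the OGL wrapper does it
 * conceptually. Names and layout are illustrative, not d2x's own. */
typedef struct { unsigned char r, g, b; } tPalEntry;

static void expand_to_rgba(const unsigned char *src, int n,
                           const tPalEntry *palette,
                           unsigned char transparent_index,
                           unsigned char *dst /* n * 4 bytes */)
{
    for (int i = 0; i < n; i++) {
        unsigned char c = src[i];
        dst[i * 4 + 0] = palette[c].r;
        dst[i * 4 + 1] = palette[c].g;
        dst[i * 4 + 2] = palette[c].b;
        /* a reserved palette index marks fully transparent texels */
        dst[i * 4 + 3] = (c == transparent_index) ? 0 : 255;
    }
}
```

The interesting part, as discussed above, is replacing that binary 0/255 alpha with a computed per-texel value for the textures that want it.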
"Phoenix blobs were animated in D2?"
There are three frames of Phoenix blob in DTX2. I see no reason why they shouldn't be animated, even if they aren't in the original D2.
eh. My hour long lunch break isn't long enough.
[edit] I guess Lehm beat me to it. Well, maybe that info will help a little.
patches,
Re transparency:
thanks for the hint. I was looking for a function like that, but not thoroughly enough. I have modified D2X so that it can have a texture's pixels' alpha calculated from the pixel colors, but the result is rather opaque explosions, so I am still experimenting with this.
Re Phoenix blob anims:
There was a bug in D2X causing not all of the animation's frames to be shown. I have fixed that bug.
Cockpit
edit: whoops, small bug: when you enter the first level in D2, a sound is played caused by a purple "bolt"... which is no longer rendered.
First movieframe prob is solved, cockpit is there, transparency effects look ok...
The view angle should be altered when using the full cockpit. I noticed this when starting up the good old DOS Descent 2. It's quite hard to play at this view angle. Maybe the rearview interface should be implemented too when the cockpit is active; don't forget about the view angle here too.
Not that it is useful, but it would be nice to have ALL functionality from the original Descent II plus a whole lot of your extras... just for the feel of it. Needless to say, you are almost there.
I don't see any exaggerated transparency effects :) Possibly you have released a version (with the gradients etc.) in between that I have not used... or the transparency is different on different hardware...
Anyway, looks good to me.
About the sound problem... I'm using a SoundMax chip onboard. When I'm working on my laptop, I sometimes have noise going through my speakers for no apparent reason, and I must enable "a primary buffer for older soundcards" in Winamp.
So I think it's an issue with SDL versus crappy soundcards. Maybe you could implement a primary buffer of some sort (I -kind of- know what it is :lol:). 11 kHz works ok though.
But of course, I think that fixing bugs has a higher priority.
another edit: missile explosions have no (or too slight) transparency.
Diedel wrote:patches,
I have modified D2X so that it can have a texture's pixels' alpha calculated from the pixel colors, but the result is rather opaque explosions, so I am still experimenting with this.

Are you modifying the actual alpha of the image? I don't think that is necessary, as it can do it when it renders.
purple bolt
see my previous post.
The purple bolt that flashes as you enter a level doesn't render when:
-the first level is being loaded (you start the game from the menu)
-you load a level from the menu using "enter a level to start from"
Otherwise the bolt IS rendered (when a level is being loaded after you have beaten the previous one)
Apparently something goes wrong when a game is loaded from the menu.
"Are you modifying the actual alpha of the image? I don't think that is necessary as it can do it when it renders."
I imagine he's doing it when the texture is loaded. Doing it at rendertime is only practical in a pixel shader, and not all cards can run pixel shaders. It doesn't matter where it gets done. If nothing else, putting it higher up in the render process makes it cleaner and more uniform, and easier to access D2's texture information (the name in particular).
"but the result are rather opaque explosions, so I am still experimenting with this."
It's supposed to be a little more opaque, but only at the bright parts. After all, how many real explosions can you completely see through? The idea behind it is that the dark parts should be considerably less visible, so the explosions look nicer but still bright and substantial. There's no such thing as a dull explosion. It depends on the blend mode used, as well. Maybe try glBlendFunc(GL_SRC_ALPHA, GL_ONE) (just make sure you set it back to the "normal" mode of (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) when you're done rendering the sprite); that might look okay, although mathematically it'd be pretty close to additive blending (GL_ONE, GL_ONE). You'd probably do better to try using channel weighting to vary the level of transparency achieved. Ideally, the explosion would be opaque enough in the center to hide whatever polymodel happens to be exploding until the game stops displaying it. It's okay to scale and bias the transparency; it's not a matter of accuracy so much as getting something that looks good. Mind posting a screenie of what you've got so far?
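The blend modes being compared differ only in the factor applied to each term. Their per-channel arithmetic can be sketched in plain C on 0..255 values (no GL involved; helper names are mine):

```c
#include <assert.h>

/* Per-channel arithmetic of the two blend modes discussed, on 0..255
 * channel values. src_a is the source alpha. Purely illustrative. */
static int blend_src_alpha_one(int src, int dst, int src_a)
{
    /* glBlendFunc(GL_SRC_ALPHA, GL_ONE): src scaled by alpha, dst kept */
    int out = (src * src_a) / 255 + dst;
    return out > 255 ? 255 : out;
}

static int blend_additive(int src, int dst)
{
    /* glBlendFunc(GL_ONE, GL_ONE): plain saturating addition */
    int out = src + dst;
    return out > 255 ? 255 : out;
}
```

At full source alpha the two are identical, which is why (GL_SRC_ALPHA, GL_ONE) ends up "pretty close to additive" for the bright texels of an explosion.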
hah! I found an old binary of my uber-hacked D1X, last modified in August 2002. The uber-hacked nature of this binary makes it possible to see the difference between what I implemented before and the old style explosions. While the new pretty explosion is more technically advanced, the old style looks more like an explosion.
So, to improve on it, try the (GL_ONE,GL_ONE_MINUS_SRC_ALPHA) blend mode (in this mode, the explosion is always full bright, added to what is underneath multiplied by the inverse of the alpha. so the higher the alpha, the darker the stuff behind the explosion), and also see what effect square rooting the final alpha values has on their overall appearance.
Ok, just tested the latest D2X. Movies look good ^^ The explosions look like what I had in D1X way back when (well, at least the blending). Several explosions in D2 level 1 weren't transparent at all (lava, Spreadfire, some others too I would imagine), though. And then I noticed a weird artifact with the Spreadfire: the last frame appeared before the first one for a frame. So I recorded a short demo, and found that the Spreadfire's explosions on the wall were blending just fine, and that the artifact was barely visible because the last frame is so transparent. The last frame of the explosion appeared before the blobs actually hit the wall (might be related to demo playback strangeness, but it just might be related to the blending problems, given the nature of the code). So, I guess something really screwy is going on with blending >_< OGL state is managed very poorly in D1X/D2X. Try adding an explicit glEnable(GL_BLEND) statement before rendering the blended bitmaps and see if that changes anything. Something is really fubar either way though.
patches,
1.3.45 contains fixed explosion alpha calculation code. Texture pixel alpha is indeed computed when loading the texture, so it only happens once (or, if the texture gets flushed from the texture cache, when it is needed again - that should happen rarely though, if at all).
The animation rendering stuff is still somehow problematic. The code determining which frame to render was definitely buggy, so I had replaced it with some different code. When I tested that code, it did at least render every frame of an animation during its lifetime.
Basically, there are two types of animated (non-wall) textures: those that play each of their frames once during their lifetime (explosions or the appearance effect flash - which btw is not purple), and those that cycle through their frames and start over until their lifetime has expired (like e.g. Phoenix blobs).
The code initially used to determine the frame to play was like

Code: Select all
<frame #> = <frame count> - ((<frame count> - 1) * <time to live left> / <total life time>) - 1

This did not render all frames, so I changed it to

Code: Select all
<frame #> = (<total life time> - (<time to live left> % <total life time>)) / <frame time>

The reason I am computing the modulus of <time to live left> by <total life time> before dividing by <frame time> is that <total life time> is basically the total time an animation needs to cycle through all frames once (<frame count> * <frame time>), while <time to live left> can be a multiple of that (like in the case of Phoenix blobs). So first I need to find out how much time of a single animation cycle has passed before determining which animation frame it should play.
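The replacement formula can be sketched as runnable C. Names are illustrative (times in arbitrary units, e.g. milliseconds), and a clamp guards the boundary case where the time left is an exact multiple of the total lifetime:

```c
#include <assert.h>

/* Sketch of the frame-selection logic described above; identifiers
 * are illustrative, not D2X's actual names. */
static int anim_frame(int frame_count, int frame_time,
                      int total_life, int time_left)
{
    /* Reduce time_left to the current cycle, then convert the elapsed
     * portion of that cycle into a frame index. */
    int cycle_left = time_left % total_life;
    int frame = (total_life - cycle_left) / frame_time;
    if (frame >= frame_count)   /* clamp the exact-multiple boundary */
        frame = frame_count - 1;
    return frame;
}
```

With 3 frames of 100 each (total life 300), the index walks 0, 1, 2 as the time left drains, and a blob living for multiple cycles (time left 450, say) correctly lands mid-cycle.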
Yes, you should. I forgot to mention that the square roots I mentioned trying should be performed on 0..1 values. I'm used to floating point color values and assumed you would get that. doh! Square rooting a value from 0..1 makes it higher, which would make GL_ONE,GL_ONE_MINUS_SRC_ALPHA less transparent.
bolt is ok now.
cockpit not quite yet... the view angle is not exactly like in the original game (screenshots of the DOS version versus the D2X cockpit view differ when I load the first level), and the crosshair doesn't change "height" on the screen when switching the cockpit on/off (dunno how this is processed in 3D... it's not just shifted up; lasers are shot from this newly positioned crosshair). Isn't all this stuff in the source, so it could be copy-pasted, more or less?
Since you care more about implementing new features I wouldn't mind if you put this cockpit stuff at the bottom of your priority list...even for me it can wait.
You know, this is very, very much a pipedream. And it's not something you usually use Descent 2 for anyway.
But what would you say to improving the scripting engine for D2X-W32 only levels? Say, adding more effects and allowing the user to specify parameters such as... hm, how long a door is open, how many robots a generator should produce, or how many times a trigger should work?
Perhaps really advanced stuff could modify player ship attributes, such as shields and energy, giving players items (a cloaking station could be kind of fun), or even the scoreboard - such as giving bonus points for achieving mission goals.
I guess it might also be useful to have 'cancelling' triggers for robot generators... so that robots are produced constantly as long as a player is in a room, but only if that is the case... could make reactor rooms a bit nastier if so we choose. Ideally this would sense whether any player is in certain cubes, but that would require a big cube list... although... actually, it could be done.
Another helpful thing, if you wanted to go down that path, is triggers that do more than one thing. Or walls that have more than one trigger, either way.
But yeah, it's a lot of work. Would be fun though. Maybe one day.
Sirius,
Yeah, maybe one day ... I'd like to see D2X work at least on Linux (and maybe even on Mac OS X), and see ppl actually use the new features (-> Entropy, Enhanced CTF, UDP/IP connections, tracker ...) first though.
Lehm,
I had to fix some additional stuff, so I pulled the files back for a while. Everything is done now and the files are available.
-
- DBB Cadet
- Posts: 14
- Joined: Sun Jun 06, 2004 9:59 pm
Re: Mine doesn't even work.
Gregster2k wrote:Every version of D2X-W32 I've ever used has crashed to desktop on startup with an NTDLL.DLL error message.
Diedel wrote:Giving some details like OS, gfx & sound hardware, driver versions would have been a good idea.

Windows XP Professional, ATI Radeon 9800 Pro, latest Omega drivers. ASUS A7N8X Deluxe with onboard sound. AMD Athlon XP 2000+. 1 GB PC3200 RAM (underclocked by the 2000+ being so slow in FSB, I might add - so it's exceptionally stable). Heck, it did this on a different video card too. It's not my configuration, and no viruses have infected NTDLL.DLL.
I have D2X-W32 running on WinXP pro & home, Win2K pro with Radeon 9800 pro, X800 XT, GF 5200, GF2 400 MX, Soundblaster onboard, Soundblaster Audigy w/o problems.
Gregster,
if NTDLL crashes, it might be some driver-related problem, maybe even rooted in SDL.dll.
Make sure you have the latest chipset drivers for your motherboard installed.
Did you try a clean install of Descent 2, and then putting D2X-W32 there? You don't need to delete your current installation - just choose another folder.
If that doesn't help, try to edit the d2x.ini file in the D2 folder and put a semicolon in front of the -fullscreen parameter.
You might also want to try the Catalyst drivers.
As a last resort, right-click on d2x-w32.exe in the program manager and choose Windows 95 compatibility mode for it - maybe that helps.
Oh yes - and don't try to run D2X-W32 when the graphical folding@home client (even minimized to tray) or RivaTuner are running.
Diedel
-
- DBB Cadet
- Posts: 14
- Joined: Sun Jun 06, 2004 9:59 pm
2.0 is AFAIK the last revision of the ASUS A7N8X-D ever created before they made the later super-models of the Deluxe series. I don't see how a motherboard chipset would affect NTDLL.DLL.
It would appear to be Windows XP specific (check your code?) as it works under compatibility mode for Windows 95, 98/SE/ME, and 2000.
Under standard Windows XP it does not start at all - not in windowed mode, not in full screen, not with all settings remmed out with semicolons, not with default settings on. No situation will get it to work.
Gregster,
I am trying to exclude every possible reason for the problem.
To be honest: I do believe the problem has to do with your config. I never had any such problems on any of my machines, and I maintain them very well.
So no chance with 'checking' my code. All I could do in a case like yours is
a) Put a development environment on the PC causing problems and debug there.
b) Create a lot of debug output to some file and exchange program debug versions and log files with the user of the PC causing problems.
Before I am going to do that, I want to make sure the PC is in proper shape. Installing the latest drivers for all system components and trying the Catalyst drivers is your part in the process.
Diedel
- Sapphire Wolf
- DBB Admiral
- Posts: 1463
- Joined: Mon Nov 24, 2003 3:01 am
- Location: Nope.avi , gender: male
- Contact: