Page 2 of 3 FirstFirst 1 2 3 LastLast
Results 11 to 20 of 23
  1. #11 — Pythagoras (Linux Newbie; Moscow, Russia; joined Aug 2006; 205 posts)

    Then how would you explain that this AIGLX wiki says it doesn't work with fglrx?

    I'll give you more details soon.
    2b|!2b, that is the question

    If you add subtraction to multiplication, you will get a division.

    GDB has a 'break' feature. Why doesn't it have a 'fix' too?

    Registered Linux User #437662

  2. #12 — antidrugue (Linux Guru; Montreal, Canada; joined Oct 2005; 3,211 posts)
    Quote Originally Posted by Pythagoras
    Then how would you explain that this AIGLX wiki says it doesn't work with fglrx?
    Honestly, I have no idea. I am sincerely puzzled as to why they would state such a thing.

    I can assure you: I am not a magician. I got it to work on the first try, and I am not even a regular Fedora Core user.

    With the FGLRX drivers, I got AIGLX + Compiz to work with almost no effort on Debian Etch, Debian Sid, Mandriva 2007, Fedora Core 6, Ubuntu Edgy, and the like.

    Why do some people say it is impossible? Unfortunately, I cannot answer that.

    Try Mandriva One 2007 (Gnome or KDE edition) and you will see. Just enable the 3D desktop: it works right away (even on the LiveCD)! No configuration needed!
    "To express yourself in freedom, you must die to everything of yesterday. From the 'old', you derive security; from the 'new', you gain the flow."

    -Bruce Lee

  3. #13 — Pythagoras (Linux Newbie; Moscow, Russia; joined Aug 2006; 205 posts)
    OK, I will try once more, since it works with your card.
    My original xorg.conf was:
    Code:
    # Xorg configuration created by pyxf86config but edited by me
    
    Section "ServerLayout"
            Identifier     "Default Layout"
            Screen      0  "aticonfig-Screen[0]" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
    EndSection
    
    Section "Files"
            FontPath     "unix/:7100"
    EndSection
    
    Section "Module"
            Load  "glx"
            Load  "dri"
            Load  "extmod"
    EndSection
    
    Section "ServerFlags"
            Option      "AIGLX" "off"
    EndSection
    
    Section "InputDevice"
            Identifier  "Keyboard0"
            Driver      "kbd"
            Option      "XkbModel" "pc105"
            Option      "XkbLayout" "us,ru"
            Option      "XkbOptions" "grp:alt_shift_toggle"
            Option      "XkbVariant" "winkeys"
    EndSection
    
    Section "Monitor"
            Identifier   "aticonfig-Monitor[0]"
     ### Comment all HorizSync and VertSync values to use DDC:
            HorizSync    63.9
            VertRefresh  60
            Option      "VendorName" "ATI Proprietary Driver"
            Option      "ModelName" "Prestigio P171"
            Option      "DPMS" "true"
    EndSection
    
    Section "Device"
            Identifier  "aticonfig-Device[0]"
            Driver      "fglrx"
            BoardName   "ATI Radeon X1300 Series"
            Option      "TVFormat" "PAL-B"
            Option      "TVOverscan" "on"
            Option      "UseInternalAGPGART" "no"
            Option      "VideoOverlay" "on"
    EndSection
    
    Section "Screen"
            Identifier "aticonfig-Screen[0]"
            Device     "aticonfig-Device[0]"
            Monitor    "aticonfig-Monitor[0]"
            DefaultDepth     24
            SubSection "Display"
                    Viewport   0 0
                    Depth     24
                    Modes    "1280x1024" "1024x768" "800x600" "640x480"
            EndSubSection
    EndSection
    
    Section "Extensions"
            Option      "Composite" "disabled"
    EndSection
    If I start Xorg with this configuration, I get direct rendering:
    Code:
    # glxinfo | grep direct
    direct rendering: Yes
    If I change AIGLX to "on" in xorg.conf, I get the following in /var/log/Xorg.0.log:
    Code:
    (EE) fglrx(0): Failed to initialize UMM driver.
    (II) fglrx(0): [drm] removed 1 reserved context for kernel
    (II) fglrx(0): [drm] unmapping 8192 bytes of SAREA 0x23000 at 0xb7f03000
    (WW) fglrx(0): ***********************************************
    (WW) fglrx(0): * DRI initialization failed!                  *
    (WW) fglrx(0): * (maybe driver kernel module missing or bad) *
    (WW) fglrx(0): * 2D acceleraton available (MMIO)             *
    (WW) fglrx(0): * no 3D acceleration available                *
    (WW) fglrx(0): ********************************************* *
    And I lose DRI:
    Code:
    # glxinfo | grep direct
    direct rendering: No
    OpenGL renderer string: Mesa GLX Indirect
    And no AIGLX:
    Code:
    # cat /var/log/Xorg.0.log|grep AIGLX
    (**) Option "AIGLX" "on"
    (**) AIGLX enabled
    (EE) AIGLX: Screen 0 is not DRI capable
    What am I doing wrong?
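    As a sanity check, the relevant failures can be pulled out of the log in one pass. A minimal sketch (it greps a canned log fragment copied from above, so it runs anywhere; on a real system, point LOG at /var/log/Xorg.0.log instead):

```shell
#!/bin/sh
# Sketch: extract the DRI/AIGLX warning and error lines from an Xorg log.
# The fragment below is canned sample data; on a real system, set
# LOG=/var/log/Xorg.0.log instead of building a temp file.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
(**) Option "AIGLX" "on"
(**) AIGLX enabled
(EE) AIGLX: Screen 0 is not DRI capable
(WW) fglrx(0): * DRI initialization failed!                  *
EOF

# Any (EE)/(WW) line mentioning AIGLX or DRI means 3D acceleration failed.
grep -E '^\((EE|WW)\).*(AIGLX|DRI)' "$LOG"
rm -f "$LOG"
```

    If that grep prints nothing, DRI came up cleanly and the problem is elsewhere.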

  4. #14 — antidrugue (Linux Guru; Montreal, Canada; joined Oct 2005; 3,211 posts)
    DRI won't work with that configuration. You are forgetting a few things, it seems.

    No guarantees, but I would try something like this:
    Code:
    # Xorg configuration created by pyxf86config but edited by me
    
    Section "ServerLayout"
            Identifier     "Default Layout"
            Screen      0  "aticonfig-Screen[0]" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
    EndSection
    
    Section "Files"
            FontPath     "unix/:7100"
    EndSection
    
    Section "Module"
            Load  "dbe"
            Load  "extmod"
            Load  "glx"
            Load  "GLcore"
            Load  "dri"
            # Load "extmod" but omit DGA extension
    	# (the DGA extension is broken in the fglrx driver)
    	SubSection "extmod"
    		Option "omit xfree86-dga"
    	EndSubSection
    EndSection
    
    Section "InputDevice"
            Identifier  "Keyboard0"
            Driver      "kbd"
            Option      "XkbModel" "pc105"
            Option      "XkbLayout" "us,ru"
            Option      "XkbOptions" "grp:alt_shift_toggle"
            Option      "XkbVariant" "winkeys"
    EndSection
    
    Section "Monitor"
            Identifier   "aticonfig-Monitor[0]"
     ### Comment all HorizSync and VertSync values to use DDC:
            HorizSync    63.9  #I would double check those
            VertRefresh  60    # double check as well
            Option      "VendorName" "ATI Proprietary Driver"
            Option      "ModelName" "Prestigio P171"
            Option      "DPMS" "true"
    EndSection
    
    Section "Device"
            Identifier  "aticonfig-Device[0]"
            Driver      "fglrx"
            BoardName   "ATI Radeon X1300 Series"
            Option      "TVFormat" "PAL-B"
            Option      "TVOverscan" "on"
            Option      "UseInternalAGPGART" "no"
            Option      "VideoOverlay"      "on"
            Option      "VBERestore" "true"
            Option      "backingstore" "true"
            Option      "RenderAccel" "true"
            Option      "OpenGLOverlay" "off"
            Option      "XAANoOffscreenPixmaps" "true"
    EndSection
    
    Section "Screen"
            Identifier "aticonfig-Screen[0]"
            Device     "aticonfig-Device[0]"
            Monitor    "aticonfig-Monitor[0]"
            DefaultDepth     16
            SubSection "Display"
                    Viewport   0 0
                    Depth     16
                    Modes    "1280x1024" "1024x768" "800x600" "640x480"
            EndSubSection
    EndSection
    
    Section "Extensions"
            Option    "Composite" "Enable"
    EndSection
    
    Section "DRI"
    	Mode	0666
    EndSection
    It should work like this.

    Did you look at the AIGLX tutorial I linked? You were missing quite a few things mentioned in it (namely the "dbe" module). And of course, AIGLX will never work if the composite extension is disabled.

    Also, I would double-check the values for your monitor refresh rate; they look very suspicious. Plus, I don't know about those TV-related settings; I've never tried them.
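    One habit worth keeping while testing configurations like this (a generic sketch, nothing fglrx-specific): make a timestamped backup of the working xorg.conf before each experiment, so a broken config is one `cp` away from being reverted.

```shell
#!/bin/sh
# Sketch: timestamped backup of xorg.conf before editing.
# CONF defaults to the usual Xorg 7.x path; the temp-file fallback is
# only so this snippet can run on a machine without that file.
CONF=${CONF:-/etc/X11/xorg.conf}
if [ ! -f "$CONF" ]; then
    CONF=$(mktemp)              # stand-in file for demonstration
    echo '# demo xorg.conf' > "$CONF"
fi
BACKUP="$CONF.$(date +%Y%m%d-%H%M%S).bak"
cp "$CONF" "$BACKUP" && echo "saved $BACKUP"
# To revert after a failed test:
#   cp "$BACKUP" "$CONF"
```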

  5. #15 — Pythagoras (Linux Newbie; Moscow, Russia; joined Aug 2006; 205 posts)
    DRI works with that configuration only if both AIGLX and the Composite extension are disabled.
    I thought I only needed to read the AIGLX part of the tutorial. And I thought Xorg 7.1 could load modules dynamically.
    The monitor refresh rates are OK (it's an LCD). All the TV-related settings are set by the aticonfig script and are OK as well, since TV-out works.
    Now, about the Composite extension: if I enable it (regardless of AIGLX), I get the following in /var/log/Xorg.0.log (tested with the configuration from your post):
    Code:
    (II) fglrx(0): Composite extension enabled, disabling direct rendering
    (WW) fglrx(0): ***********************************************
    (WW) fglrx(0): * DRI initialization failed!                  *
    (WW) fglrx(0): * (maybe driver kernel module missing or bad) *
    (WW) fglrx(0): * 2D acceleraton available (MMIO)             *
    (WW) fglrx(0): * no 3D acceleration available                *
    (WW) fglrx(0): ********************************************* *
    I also got this:
    Code:
    (WW) fglrx(0): Option "VBERestore" is not used
    (WW) fglrx(0): Option "RenderAccel" is not used
    (EE) AIGLX: Screen 0 is not DRI capable
    Why do you recommend changing DefaultDepth to 16?

  6. #16 — Pythagoras (Linux Newbie; Moscow, Russia; joined Aug 2006; 205 posts)
    Do you have any ideas?

    This is my current xorg.conf file. Have I forgotten anything?
    Code:
    # Xorg configuration created by pyxf86config but edited by me
    
    Section "ServerLayout"
            Identifier     "Default Layout"
            Screen      0  "aticonfig-Screen[0]" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
    EndSection
    
    Section "Files"
            FontPath     "unix/:7100"
    EndSection
    
    Section "Module"
            Load  "glx"
            Load  "dri"
            Load  "dbe"
            Load  "extmod"
            Load  "GLcore"
            SubSection "extmod"
                    Option "omit xfree86-dga"
            EndSubSection
    EndSection
    
    Section "ServerFlags"
            Option      "AIGLX" "on"
    EndSection
    
    Section "InputDevice"
            Identifier  "Keyboard0"
            Driver      "kbd"
            Option      "XkbModel" "pc105"
            Option      "XkbLayout" "us,ru"
            Option      "XkbOptions" "grp:alt_shift_toggle"
            Option      "XkbVariant" "winkeys"
    EndSection
    
    Section "Monitor"
            Identifier   "aticonfig-Monitor[0]"
            HorizSync    63.9
            VertRefresh  60
            Option      "VendorName" "ATI Proprietary Driver"
            Option      "ModelName" "Prestigio P171"
            Option      "DPMS" "true"
    EndSection
    
    Section "Device"
            Identifier  "aticonfig-Device[0]"
            Driver      "fglrx"
            BoardName   "ATI Radeon X1300 Series"
            Option      "TVFormat" "PAL-B"
            Option      "TVOverscan" "on"
            Option      "UseInternalAGPGART" "no"
            Option      "VideoOverlay" "on"
    # newly added
            Option      "VBERestore" "true"
            Option      "backingstore" "true"
            Option      "RenderAccel" "true"
            Option      "OpenGLOverlay" "off"
            Option      "XAANoOffscreenPixmaps" "true"
    EndSection
    
    Section "Screen"
            Identifier "aticonfig-Screen[0]"
            Device     "aticonfig-Device[0]"
            Monitor    "aticonfig-Monitor[0]"
            DefaultDepth     24
            SubSection "Display"
                    Viewport   0 0
                    Depth     24
                    Modes    "1280x1024" "1024x768" "800x600" "640x480"
            EndSubSection
    EndSection
    
    Section "Extensions"
            Option      "Composite" "Enable"
    EndSection
    
    Section "DRI"
            Mode    0666
    EndSection
    Neither DRI nor AIGLX works...

  7. #17 — antidrugue (Linux Guru; Montreal, Canada; joined Oct 2005; 3,211 posts)
    Quote Originally Posted by Pythagoras
    Why do you recommend changing DefaultDepth to 16?
    Simply because it is about 50% faster than 24-bit color depth. You can verify that with:
    Code:
    glxgears
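    To put a number on it, compare the frame counts from two glxgears runs, one at each depth. A rough sketch (FPS_16 and FPS_24 are made-up sample values, not measurements; substitute what glxgears actually prints on your machine, and note the 50% figure above is an estimate that varies by card):

```shell
#!/bin/sh
# Sketch: relative speedup between two glxgears runs.
# FPS_16 and FPS_24 are hypothetical sample numbers; replace them
# with the frames-per-second values glxgears reports at each depth.
FPS_16=1500
FPS_24=1000
# Integer percentage gain of 16-bit over 24-bit color depth.
GAIN=$(( (FPS_16 - FPS_24) * 100 / FPS_24 ))
echo "16-bit depth is ${GAIN}% faster than 24-bit"
```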
    Quote Originally Posted by Pythagoras
    Do you have any ideas?
    Well, I can't say I've never had issues with ATI drivers myself. Nothing a quick reinstallation of the drivers couldn't solve, though. Now that I've given away that ATI 9550 for an NVIDIA 7600GT, I'm glad I don't have to look back.

    As I am not a Fedora user myself, my suggestions are limited.

    Your /etc/X11/xorg.conf looks good enough. It certainly doesn't look like the one I had with Fedora Core 6 (unfortunately I don't have it anymore), which was really streamlined (very much like the default one).

    I got it to work in FC6 very simply: a clean installation, configuration of xorg.conf to support AIGLX, and a click on "enable 3d desktop". The whole process worked just "as advertised".

    If you really want to get AIGLX+Compiz (Beryl is much better though), I suggest you try another distro.

  8. #18 — Pythagoras (Linux Newbie; Moscow, Russia; joined Aug 2006; 205 posts)
    Well, I don't think there could be anything distro-specific about it. And when fglrx says "(II) fglrx(0): Composite extension enabled, disabling direct rendering", it doesn't look like an error, but rather like an unsupported feature.
    I'll wait for driver updates.

  9. #19 — antidrugue (Linux Guru; Montreal, Canada; joined Oct 2005; 3,211 posts)
    Quote Originally Posted by Pythagoras
    Well, I don't think there could be anything distro-specific. And when fglrx says "(II) fglrx(0): Composite extension enabled, disabling direct rendering" it doesn't look like an error but like an unsupported feature.
    I'll wait for driver updates.
    Hum, ok.

    The Gentoo folks say it works, though:
    http://gentoo-wiki.com/HOWTO_AIGLX

    The Debian folks say it too:
    http://wiki.debian.org/Compiz

    I say it too (well, I got it to work in numerous distros, as I told you, and I am not lying).

    And if you look around, many people have got it to work:
    Quote Originally Posted by JDillio12
    However, Xorg 7.1 is defaulted to AIGLX, which some said wouldn't work with ATI cards. Well, it does, and I now have Compiz + Direct 3D rendering enabled.
    from here.

  10. #20 — Pythagoras (Linux Newbie; Moscow, Russia; joined Aug 2006; 205 posts)
    The howto from your first link says fglrx is not supported:
    Cards Not Supported
    • ATI: Rage 128. - Driver locking issue.
    • ATI: Mach64. - No DRM support in Fedora, still insecure.
    • ATI: Any closed source driver. - Uses incompatible DRI API.
    • ATI Radeon Xpress 200M. - Memory issues.
    • ...
    The second link says it too:
    First, you should verify if your video card is supported. Check the list of supported cards below (on other cards, Compiz would need Xgl):
    • Intel i830 to i945 graphic cards
    • ATI Radeon cards up to X800 series
    • NVIDIA graphic cards are supported in the 9xxx-series proprietary driver
    Radeon cards up to the X800 series are supported by the open drivers, but mine is an X1300. And GLX_EXT_texture_from_pixmap first appeared in the NVIDIA 9xxx-series proprietary driver.
    Finally, I'm not sure JDillio12 uses fglrx. He says he uses the ATI drivers installed by default:
    Quote Originally Posted by JDillio12
    I switched over to Debian Testing (Netinstall Etch) that comes with Xorg 7.1 and the ATI drivers already installed.
    I know Debian Testing includes fglrx, but I'm not sure it installs fglrx by default.
    I believe that you are not lying, and I'm perplexed...
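    Incidentally, whether a given GLX stack can support an AIGLX compositor comes down to GLX_EXT_texture_from_pixmap being listed in the extension string. A small sketch of the check (it greps a canned glxinfo fragment so it runs anywhere; on a live X session, pipe real `glxinfo` output through the same grep instead):

```shell
#!/bin/sh
# Sketch: test for the GLX_EXT_texture_from_pixmap extension, which
# AIGLX compositors (Compiz, Beryl) require.
# SAMPLE below is a canned glxinfo fragment; on a live session run:
#   glxinfo | grep -q GLX_EXT_texture_from_pixmap
SAMPLE='GLX extensions:
    GLX_ARB_multisample, GLX_EXT_visual_info,
    GLX_EXT_texture_from_pixmap, GLX_SGI_swap_control'
if echo "$SAMPLE" | grep -q GLX_EXT_texture_from_pixmap; then
    echo "texture_from_pixmap: present"
else
    echo "texture_from_pixmap: missing"
fi
```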
