This paper presents a texture rendering system that relies on pseudo-haptic and audio feedback. While the user touches a texture displayed on a touchscreen, the associated image is deformed according to the contact area and the rubbing motion to simulate pressure. In addition, audio feedback is synthesized in real time to simulate friction. A novel example-based scheme synthesizes the output sound from recordings of a finger rubbing real textures at several speeds. The system can be implemented on any existing touchscreen without additional mechanical hardware.
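
The example-based friction synthesis can be pictured as crossfading between recordings captured at a few reference rubbing speeds. The sketch below is a minimal illustration of that idea only, assuming a block-based audio callback and placeholder noise buffers in place of real recordings; the names `friction_block` and `reference_speeds` are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical illustration: crossfade between friction recordings made at
# discrete rubbing speeds to approximate the sound at an arbitrary finger
# speed. The buffers and parameter names below are assumptions, not the
# authors' implementation.

SAMPLE_RATE = 44100  # Hz

# Placeholder "recordings": one looped buffer per reference speed (m/s).
# In a real system these would be audio files of a finger rubbing the texture.
reference_speeds = np.array([0.05, 0.15, 0.30])  # m/s
recordings = [np.random.randn(SAMPLE_RATE) * 0.1 for _ in reference_speeds]


def friction_block(finger_speed, phase, block_size=512):
    """Return one audio block for the current finger speed.

    The two recordings whose reference speeds bracket `finger_speed` are
    mixed with a linear crossfade; `phase` is the loop read position.
    """
    # Find the bracketing reference speeds and the interpolation weight.
    idx = np.clip(np.searchsorted(reference_speeds, finger_speed),
                  1, len(reference_speeds) - 1)
    lo, hi = recordings[idx - 1], recordings[idx]
    span = reference_speeds[idx] - reference_speeds[idx - 1]
    w = np.clip((finger_speed - reference_speeds[idx - 1]) / span, 0.0, 1.0)

    # Read a looped block from each recording and crossfade.
    samples = (phase + np.arange(block_size)) % len(lo)
    block = (1.0 - w) * lo[samples] + w * hi[samples]

    # Scale loudness with speed so a static finger produces silence.
    gain = np.clip(finger_speed / reference_speeds[-1], 0.0, 1.0)
    return gain * block, (phase + block_size) % len(lo)


# Example: synthesize one block while the finger moves at 0.12 m/s.
block, phase = friction_block(finger_speed=0.12, phase=0)
```

In such a scheme the finger speed estimated from touch events drives both the crossfade weight and the output gain, so the sound naturally starts, stops, and changes character as the rubbing motion varies.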