Java Code Examples for javax.sound.sampled.AudioSystem#NOT_SPECIFIED
The following examples show how to use
javax.sound.sampled.AudioSystem#NOT_SPECIFIED.
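For orientation before the examples: AudioSystem.NOT_SPECIFIED is the sentinel constant (the int value -1) that javax.sound.sampled uses wherever a length, size, or rate is unknown. The sketch below is a minimal, self-contained illustration of the two usage patterns that recur throughout the examples: passing the sentinel where a property is unknown, and checking for it before doing arithmetic (the class name NotSpecifiedDemo is ours, not from any of the projects below).

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;

public class NotSpecifiedDemo {
    public static void main(String[] args) {
        // NOT_SPECIFIED is defined as -1
        System.out.println(AudioSystem.NOT_SPECIFIED);

        // Pattern 1: pass the sentinel for properties we don't know yet.
        AudioFormat format = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                AudioSystem.NOT_SPECIFIED,  // sample rate unknown
                16,                         // sample size in bits
                2,                          // channels
                AudioSystem.NOT_SPECIFIED,  // frame size unknown
                AudioSystem.NOT_SPECIFIED,  // frame rate unknown
                false);                     // little-endian

        // Pattern 2: check for the sentinel before dividing or indexing,
        // falling back to a safe default (as the codec examples below do).
        int frameSize = format.getFrameSize();
        if (frameSize == AudioSystem.NOT_SPECIFIED) {
            frameSize = 1;
        }
        System.out.println(frameSize);
    }
}
```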
Example 1
Source File: AiffFileWriter.java From jdk8u-jdk with GNU General Public License v2.0 | 6 votes |
public int write(AudioInputStream stream, AudioFileFormat.Type fileType, OutputStream out) throws IOException {
    //$$fb the following check must come first ! Otherwise
    // the next frame length check may throw an IOException and
    // interrupt iterating File Writers. (see bug 4351296)

    // throws IllegalArgumentException if not supported
    AiffFileFormat aiffFileFormat = (AiffFileFormat)getAudioFileFormat(fileType, stream);

    // we must know the total data length to calculate the file length
    if( stream.getFrameLength() == AudioSystem.NOT_SPECIFIED ) {
        throw new IOException("stream length not specified");
    }

    int bytesWritten = writeAiffFile(stream, aiffFileFormat, out);
    return bytesWritten;
}
Example 2
Source File: MpegAudioFileReader.java From jclic with GNU General Public License v2.0 | 6 votes |
/**
 * Returns AudioFileFormat from URL.
 */
@Override
public AudioFileFormat getAudioFileFormat(URL url) throws UnsupportedAudioFileException, IOException {
    if (TDebug.TraceAudioFileReader) {
        TDebug.out("MpegAudioFileReader.getAudioFileFormat(URL): begin");
    }
    long lFileLengthInBytes = AudioSystem.NOT_SPECIFIED;
    URLConnection conn = url.openConnection();
    // Tell shoutcast server (if any) that this SPI supports shoutcast streams.
    conn.setRequestProperty("Icy-Metadata", "1");
    InputStream inputStream = conn.getInputStream();
    AudioFileFormat audioFileFormat = null;
    try {
        audioFileFormat = getAudioFileFormat(inputStream, lFileLengthInBytes);
    } finally {
        inputStream.close();
    }
    if (TDebug.TraceAudioFileReader) {
        TDebug.out("MpegAudioFileReader.getAudioFileFormat(URL): end");
    }
    return audioFileFormat;
}
Example 3
Source File: AiffFileWriter.java From jdk8u60 with GNU General Public License v2.0 | 6 votes |
public int write(AudioInputStream stream, AudioFileFormat.Type fileType, OutputStream out) throws IOException {
    //$$fb the following check must come first ! Otherwise
    // the next frame length check may throw an IOException and
    // interrupt iterating File Writers. (see bug 4351296)

    // throws IllegalArgumentException if not supported
    AiffFileFormat aiffFileFormat = (AiffFileFormat)getAudioFileFormat(fileType, stream);

    // we must know the total data length to calculate the file length
    if( stream.getFrameLength() == AudioSystem.NOT_SPECIFIED ) {
        throw new IOException("stream length not specified");
    }

    int bytesWritten = writeAiffFile(stream, aiffFileFormat, out);
    return bytesWritten;
}
Example 4
Source File: WaveFileWriter.java From TencentKona-8 with GNU General Public License v2.0 | 6 votes |
public int write(AudioInputStream stream, AudioFileFormat.Type fileType, OutputStream out) throws IOException {
    //$$fb the following check must come first ! Otherwise
    // the next frame length check may throw an IOException and
    // interrupt iterating File Writers. (see bug 4351296)

    // throws IllegalArgumentException if not supported
    WaveFileFormat waveFileFormat = (WaveFileFormat)getAudioFileFormat(fileType, stream);

    //$$fb when we got this far, we are committed to write this file

    // we must know the total data length to calculate the file length
    if( stream.getFrameLength() == AudioSystem.NOT_SPECIFIED ) {
        throw new IOException("stream length not specified");
    }

    int bytesWritten = writeWaveFile(stream, waveFileFormat, out);
    return bytesWritten;
}
Example 5
Source File: AlawEncoderSync.java From openjdk-jdk8u-backup with GNU General Public License v2.0 | 6 votes |
@Override
public void run() {
    log("ConversionThread[" + num + "] started.");
    try {
        InputStream inStream = new ByteArrayInputStream(pcmBuffer);
        AudioInputStream pcmStream = new AudioInputStream(
                inStream, pcmFormat, AudioSystem.NOT_SPECIFIED);
        AudioInputStream alawStream = AudioSystem.getAudioInputStream(alawFormat, pcmStream);
        ByteArrayOutputStream outStream = new ByteArrayOutputStream();
        int read = 0;
        byte[] data = new byte[4096];
        while ((read = alawStream.read(data)) != -1) {
            outStream.write(data, 0, read);
        }
        alawStream.close();
        resultArray = outStream.toByteArray();
    } catch (Exception ex) {
        log("ConversionThread[" + num + "] exception:");
        log(ex);
    }
    log("ConversionThread[" + num + "] completed.");
}
Example 6
Source File: WaveFileWriter.java From openjdk-jdk9 with GNU General Public License v2.0 | 5 votes |
@Override
public int write(AudioInputStream stream, AudioFileFormat.Type fileType, File out) throws IOException {
    Objects.requireNonNull(stream);
    Objects.requireNonNull(fileType);
    Objects.requireNonNull(out);

    // throws IllegalArgumentException if not supported
    WaveFileFormat waveFileFormat = (WaveFileFormat)getAudioFileFormat(fileType, stream);

    // first write the file without worrying about length fields
    FileOutputStream fos = new FileOutputStream( out );     // throws IOException
    BufferedOutputStream bos = new BufferedOutputStream( fos, bisBufferSize );
    int bytesWritten = writeWaveFile(stream, waveFileFormat, bos );
    bos.close();

    // now, if length fields were not specified, calculate them,
    // open as a random access file, write the appropriate fields,
    // close again....
    if( waveFileFormat.getByteLength()== AudioSystem.NOT_SPECIFIED ) {
        int dataLength=bytesWritten-waveFileFormat.getHeaderSize();
        int riffLength=dataLength + waveFileFormat.getHeaderSize() - 8;
        RandomAccessFile raf=new RandomAccessFile(out, "rw");
        // skip RIFF magic
        raf.skipBytes(4);
        raf.writeInt(big2little( riffLength ));
        // skip WAVE magic, fmt_ magic, fmt_ length, fmt_ chunk, data magic
        raf.skipBytes(4+4+4+WaveFileFormat.getFmtChunkSize(waveFileFormat.getWaveType())+4);
        raf.writeInt(big2little( dataLength ));
        // that's all
        raf.close();
    }
    return bytesWritten;
}
Example 7
Source File: AbstractDataLine.java From openjdk-8-source with GNU General Public License v2.0 | 5 votes |
public final long getMicrosecondPosition() {
    long microseconds = getLongFramePosition();
    if (microseconds != AudioSystem.NOT_SPECIFIED) {
        microseconds = Toolkit.frames2micros(getFormat(), microseconds);
    }
    return microseconds;
}
Example 8
Source File: AudioFloatInputStream.java From jdk8u60 with GNU General Public License v2.0 | 5 votes |
public static AudioFloatInputStream getInputStream(AudioFormat format, byte[] buffer, int offset, int len) {
    AudioFloatConverter converter = AudioFloatConverter.getConverter(format);
    if (converter != null)
        return new BytaArrayAudioFloatInputStream(converter, buffer, offset, len);
    InputStream stream = new ByteArrayInputStream(buffer, offset, len);
    long aLen = format.getFrameSize() == AudioSystem.NOT_SPECIFIED
            ? AudioSystem.NOT_SPECIFIED : len / format.getFrameSize();
    AudioInputStream astream = new AudioInputStream(stream, format, aLen);
    return getInputStream(astream);
}
Example 9
Source File: SoftMixingMixer.java From openjdk-jdk8u-backup with GNU General Public License v2.0 | 5 votes |
public int getMaxLines(Line.Info info) {
    if (info.getLineClass() == SourceDataLine.class)
        return AudioSystem.NOT_SPECIFIED;
    if (info.getLineClass() == Clip.class)
        return AudioSystem.NOT_SPECIFIED;
    return 0;
}
Example 10
Source File: AbstractDataLine.java From openjdk-8-source with GNU General Public License v2.0 | 4 votes |
/**
 * Constructs a new AbstractLine.
 */
protected AbstractDataLine(DataLine.Info info, AbstractMixer mixer, Control[] controls) {
    this(info, mixer, controls, null, AudioSystem.NOT_SPECIFIED);
}
Example 11
Source File: DummySourceDataLine.java From jdk8u-dev-jdk with GNU General Public License v2.0 | 4 votes |
public float getLevel() {
    return AudioSystem.NOT_SPECIFIED;
}
Example 12
Source File: UlawCodec.java From jdk8u_jdk with GNU General Public License v2.0 | 4 votes |
UlawCodecStream(AudioInputStream stream, AudioFormat outputFormat) {
    super(stream, outputFormat, AudioSystem.NOT_SPECIFIED);

    AudioFormat inputFormat = stream.getFormat();

    // throw an IllegalArgumentException if not ok
    if (!(isConversionSupported(outputFormat, inputFormat))) {
        throw new IllegalArgumentException("Unsupported conversion: "
                + inputFormat.toString() + " to " + outputFormat.toString());
    }

    //$$fb 2002-07-18: fix for 4714846: JavaSound ULAW (8-bit) encoder erroneously depends on endian-ness
    boolean PCMIsBigEndian;

    // determine whether we are encoding or decoding
    if (AudioFormat.Encoding.ULAW.equals(inputFormat.getEncoding())) {
        encode = false;
        encodeFormat = inputFormat;
        decodeFormat = outputFormat;
        PCMIsBigEndian = outputFormat.isBigEndian();
    } else {
        encode = true;
        encodeFormat = outputFormat;
        decodeFormat = inputFormat;
        PCMIsBigEndian = inputFormat.isBigEndian();
        tempBuffer = new byte[tempBufferSize];
    }

    // setup tables according to byte order
    if (PCMIsBigEndian) {
        tabByte1 = ULAW_TABH;
        tabByte2 = ULAW_TABL;
        highByte = 0;
        lowByte = 1;
    } else {
        tabByte1 = ULAW_TABL;
        tabByte2 = ULAW_TABH;
        highByte = 1;
        lowByte = 0;
    }

    // set the AudioInputStream length in frames if we know it
    if (stream instanceof AudioInputStream) {
        frameLength = ((AudioInputStream)stream).getFrameLength();
    }
    // set framePos to zero
    framePos = 0;
    frameSize = inputFormat.getFrameSize();
    if (frameSize == AudioSystem.NOT_SPECIFIED) {
        frameSize = 1;
    }
}
Example 13
Source File: PCMtoPCMCodec.java From jdk8u-jdk with GNU General Public License v2.0 | 4 votes |
PCMtoPCMCodecStream(AudioInputStream stream, AudioFormat outputFormat) {
    super(stream, outputFormat, -1);

    int sampleSizeInBits = 0;
    AudioFormat.Encoding inputEncoding = null;
    AudioFormat.Encoding outputEncoding = null;
    boolean inputIsBigEndian;
    boolean outputIsBigEndian;

    AudioFormat inputFormat = stream.getFormat();

    // throw an IllegalArgumentException if not ok
    if ( ! (isConversionSupported(inputFormat, outputFormat)) ) {
        throw new IllegalArgumentException("Unsupported conversion: "
                + inputFormat.toString() + " to " + outputFormat.toString());
    }

    inputEncoding = inputFormat.getEncoding();
    outputEncoding = outputFormat.getEncoding();
    inputIsBigEndian = inputFormat.isBigEndian();
    outputIsBigEndian = outputFormat.isBigEndian();
    sampleSizeInBits = inputFormat.getSampleSizeInBits();
    sampleSizeInBytes = sampleSizeInBits/8;

    // determine conversion to perform
    if( sampleSizeInBits==8 ) {
        if( AudioFormat.Encoding.PCM_UNSIGNED.equals(inputEncoding) &&
            AudioFormat.Encoding.PCM_SIGNED.equals(outputEncoding) ) {
            conversionType = PCM_SWITCH_SIGNED_8BIT;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_SWITCH_SIGNED_8BIT");
        } else if( AudioFormat.Encoding.PCM_SIGNED.equals(inputEncoding) &&
                   AudioFormat.Encoding.PCM_UNSIGNED.equals(outputEncoding) ) {
            conversionType = PCM_SWITCH_SIGNED_8BIT;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_SWITCH_SIGNED_8BIT");
        }
    } else {
        if( inputEncoding.equals(outputEncoding) && (inputIsBigEndian != outputIsBigEndian) ) {
            conversionType = PCM_SWITCH_ENDIAN;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_SWITCH_ENDIAN");
        } else if (AudioFormat.Encoding.PCM_UNSIGNED.equals(inputEncoding) && !inputIsBigEndian &&
                   AudioFormat.Encoding.PCM_SIGNED.equals(outputEncoding) && outputIsBigEndian) {
            conversionType = PCM_UNSIGNED_LE2SIGNED_BE;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_UNSIGNED_LE2SIGNED_BE");
        } else if (AudioFormat.Encoding.PCM_SIGNED.equals(inputEncoding) && !inputIsBigEndian &&
                   AudioFormat.Encoding.PCM_UNSIGNED.equals(outputEncoding) && outputIsBigEndian) {
            conversionType = PCM_SIGNED_LE2UNSIGNED_BE;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_SIGNED_LE2UNSIGNED_BE");
        } else if (AudioFormat.Encoding.PCM_UNSIGNED.equals(inputEncoding) && inputIsBigEndian &&
                   AudioFormat.Encoding.PCM_SIGNED.equals(outputEncoding) && !outputIsBigEndian) {
            conversionType = PCM_UNSIGNED_BE2SIGNED_LE;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_UNSIGNED_BE2SIGNED_LE");
        } else if (AudioFormat.Encoding.PCM_SIGNED.equals(inputEncoding) && inputIsBigEndian &&
                   AudioFormat.Encoding.PCM_UNSIGNED.equals(outputEncoding) && !outputIsBigEndian) {
            conversionType = PCM_SIGNED_BE2UNSIGNED_LE;
            if(Printer.debug) Printer.debug("PCMtoPCMCodecStream: conversionType = PCM_SIGNED_BE2UNSIGNED_LE");
        }
    }

    // set the audio stream length in frames if we know it
    frameSize = inputFormat.getFrameSize();
    if( frameSize == AudioSystem.NOT_SPECIFIED ) {
        frameSize=1;
    }
    if( stream instanceof AudioInputStream ) {
        frameLength = stream.getFrameLength();
    } else {
        frameLength = AudioSystem.NOT_SPECIFIED;
    }
    // set framePos to zero
    framePos = 0;
}
Example 14
Source File: DummySourceDataLine.java From jdk8u-jdk with GNU General Public License v2.0 | 4 votes |
public float getLevel() {
    return AudioSystem.NOT_SPECIFIED;
}
Example 16
Source File: AbstractDataLine.java From openjdk-jdk8u-backup with GNU General Public License v2.0 | 4 votes |
/**
 * Constructs a new AbstractLine.
 */
protected AbstractDataLine(DataLine.Info info, AbstractMixer mixer, Control[] controls) {
    this(info, mixer, controls, null, AudioSystem.NOT_SPECIFIED);
}
Example 17
Source File: AlawCodec.java From TencentKona-8 with GNU General Public License v2.0 | 4 votes |
AlawCodecStream(AudioInputStream stream, AudioFormat outputFormat) {
    super(stream, outputFormat, -1);

    AudioFormat inputFormat = stream.getFormat();

    // throw an IllegalArgumentException if not ok
    if ( ! (isConversionSupported(outputFormat, inputFormat)) ) {
        throw new IllegalArgumentException("Unsupported conversion: "
                + inputFormat.toString() + " to " + outputFormat.toString());
    }

    //$$fb 2002-07-18: fix for 4714846: JavaSound ULAW (8-bit) encoder erroneously depends on endian-ness
    boolean PCMIsBigEndian;

    // determine whether we are encoding or decoding
    if (AudioFormat.Encoding.ALAW.equals(inputFormat.getEncoding())) {
        encode = false;
        encodeFormat = inputFormat;
        decodeFormat = outputFormat;
        PCMIsBigEndian = outputFormat.isBigEndian();
    } else {
        encode = true;
        encodeFormat = outputFormat;
        decodeFormat = inputFormat;
        PCMIsBigEndian = inputFormat.isBigEndian();
        tempBuffer = new byte[tempBufferSize];
    }

    if (PCMIsBigEndian) {
        tabByte1 = ALAW_TABH;
        tabByte2 = ALAW_TABL;
        highByte = 0;
        lowByte = 1;
    } else {
        tabByte1 = ALAW_TABL;
        tabByte2 = ALAW_TABH;
        highByte = 1;
        lowByte = 0;
    }

    // set the AudioInputStream length in frames if we know it
    if (stream instanceof AudioInputStream) {
        frameLength = ((AudioInputStream)stream).getFrameLength();
    }
    // set framePos to zero
    framePos = 0;
    frameSize = inputFormat.getFrameSize();
    if( frameSize==AudioSystem.NOT_SPECIFIED ) {
        frameSize=1;
    }
}
Example 18
Source File: AiffFileWriter.java From Bytecoder with Apache License 2.0 | 4 votes |
@Override
public int write(AudioInputStream stream, AudioFileFormat.Type fileType, File out) throws IOException {
    Objects.requireNonNull(stream);
    Objects.requireNonNull(fileType);
    Objects.requireNonNull(out);

    // throws IllegalArgumentException if not supported
    AiffFileFormat aiffFileFormat = (AiffFileFormat)getAudioFileFormat(fileType, stream);

    // first write the file without worrying about length fields
    final int bytesWritten;
    try (final FileOutputStream fos = new FileOutputStream(out);
         final BufferedOutputStream bos = new BufferedOutputStream(fos)) {
        bytesWritten = writeAiffFile(stream, aiffFileFormat, bos);
    }

    // now, if length fields were not specified, calculate them,
    // open as a random access file, write the appropriate fields,
    // close again....
    if( aiffFileFormat.getByteLength()== AudioSystem.NOT_SPECIFIED ) {
        // $$kk: 10.22.99: jan: please either implement this or throw an exception!
        // $$fb: 2001-07-13: done. Fixes Bug 4479981
        int channels = aiffFileFormat.getFormat().getChannels();
        int sampleSize = aiffFileFormat.getFormat().getSampleSizeInBits();
        int ssndBlockSize = channels * ((sampleSize + 7) / 8);
        int aiffLength=bytesWritten;
        int ssndChunkSize=aiffLength-aiffFileFormat.getHeaderSize()+16;
        long dataSize=ssndChunkSize-16;
        //TODO possibly incorrect round
        int numFrames = (int) (dataSize / ssndBlockSize);
        try (final RandomAccessFile raf = new RandomAccessFile(out, "rw")) {
            // skip FORM magic
            raf.skipBytes(4);
            raf.writeInt(aiffLength - 8);
            // skip aiff2 magic, fver chunk, comm magic, comm size, channel count,
            raf.skipBytes(4 + aiffFileFormat.getFverChunkSize() + 4 + 4 + 2);
            // write frame count
            raf.writeInt(numFrames);
            // skip sample size, samplerate, SSND magic
            raf.skipBytes(2 + 10 + 4);
            raf.writeInt(ssndChunkSize - 8);
            // that's all
        }
    }
    return bytesWritten;
}
Example 19
Source File: AbstractLine.java From jdk8u-jdk with GNU General Public License v2.0 | 2 votes |
/**
 * Return the frame position in a long value.
 * This implementation returns AudioSystem.NOT_SPECIFIED.
 */
public long getLongFramePosition() {
    return AudioSystem.NOT_SPECIFIED;
}