fixing error: undefined macro: AC_PROG_LIBTOOL
May 17th, 2007 by Lawrence David
I was getting this error while trying to compile faad2:
configure.in:14: error: possibly undefined macro: AC_PROG_LIBTOOL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
autoreconf: /usr/bin/autoconf failed with exit status: 1
I made this error go away by installing libtool:
$ sudo apt-get install libtool
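If you want to sanity-check that the install actually fixed things before re-running your build, here is a small sketch. The paths are typical Debian/Ubuntu defaults, so treat them as assumptions rather than guarantees:

```shell
#!/bin/sh
# AC_PROG_LIBTOOL is defined in libtool.m4, which must live somewhere
# aclocal searches. Check that it does after installing libtool.
ACDIR=$(aclocal --print-ac-dir 2>/dev/null || echo /usr/share/aclocal)
if [ -f "$ACDIR/libtool.m4" ]; then
  STATUS="libtool macros found in $ACDIR; re-run autoreconf -fi"
else
  STATUS="libtool.m4 not in $ACDIR; libtool is missing or installed elsewhere"
fi
echo "$STATUS"
```

Either message tells you which situation you are in; if the macros are present, a fresh `autoreconf -fi` should get past the error.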
Thanks! Exactly the error I was getting, all because of missing libtool. faad is a bitch to install!
Thanks!
configure.in:50: error: possibly undefined macro: AC_PROG_LIBTOOL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
autoreconf: /sw/bin/autoconf-2.59 failed with exit status: 1
Thanks for giving the solution... I had been fighting this error for a week.
Hi, I am having the same problem; however, I do not have apt-get installed on my server.
Is there another resolution to this problem?
Thanks
thanks, that fixed my building issue
Thanks
It saved me precious time.
Regards,
Légion.
Thanks !
I just wanted to install faad2 (codecs) for ffmpeg and hit the same problem!
It took me nearly an hour ~~~~
Best Regards~
Yea Google! Ran into the same problem on Ubuntu 6.06 server. Thanks!
Awesome need to query
configure.in:14: error: possibly undefined macro: AC_PROG_LIBTOOL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
Many thanks. Worked for me.
————— EXPLANATION —————
That's right: to fix the error you need to install libtool, but that's not all.
You may have to install it from source, and in that case you MUST specify a prefix for the configure script:
./configure --prefix=/usr
Otherwise you have to adjust paths somehow so that autoconf can see the third-party macros that libtool provides.
PS: it took me an hour to figure this out myself.
Anyway, thanks for the tip.
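To make the from-source route concrete, here is a sketch. The version-less configure line restates the commenter's point; the `ACLOCAL_PATH` fallback is an alternative I am adding, and it assumes automake 1.13 or later:

```shell
#!/bin/sh
# Installing libtool from source under /usr puts libtool.m4 into
# /usr/share/aclocal, which aclocal searches by default:
#   ./configure --prefix=/usr && make && sudo make install
# If a previous install already went to /usr/local, widening aclocal's
# search path avoids a rebuild (ACLOCAL_PATH needs automake >= 1.13):
export ACLOCAL_PATH="/usr/local/share/aclocal${ACLOCAL_PATH:+:$ACLOCAL_PATH}"
echo "extra aclocal search path: $ACLOCAL_PATH"
```

With `ACLOCAL_PATH` exported, re-running `autoreconf -fi` should pick up the macros without touching the prefix.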
Thanks! Exactly what was needed.
I much appreciate you taking time to post solutions!
–B
Exactly the problem I had (though on OpenSuse). Thanks!
Thanks!
Woohoo, thanks a lot, you saved me, man.
brilliant – just the answer i needed! (Centos 5.2)
Very glad that I stumbled on your post. Your answer was helpful, thank you.
I’d like to say, however, that a “possibly undefined macro” doesn’t even *remotely* spell “you need libtool” to me.
One more reason I truly dislike automake/autoconf.
thx!!
Thx!! hit the exact issue!
Yabadabaduuuu !
Thanks a lot !
thnx for help
[...] able to fix it easily by installing libtool (in ubuntu: sudo apt-get install libtool -thanks to this site for the info) and then tried again and it worked fine. Now, this next step will separate [...]
I had the same problem while attempting to install faad2-2.7. Thank you for the solution.
The faad2 author should perform his own tests during installation and not rely on his users to find this post, if you ask me.
Ran into this problem while trying to install PET (http://wiki.delph-in.net/moin/PetTop). Your fix worked perfectly. Many thanks!
LOVE IT
Thanks, this was very helpful!
First google hit when searching for the same error and the fix is valid, so … bump to keep it in the first place
many thanks!
Thanks, fixed my problem too (while trying to install gPhoto from SVN)
thanks
Solved: same error with CyberLinkCC on Ubuntu 11.04:
~/bin/CyberLinkCC$ ./bootstrap
configure.in:70: error: possibly undefined macro: AC_PROG_LIBTOOL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
libtool install solved the issue.
thank you !
Getting this error all over the place jhbuild'ing GNOME 3.4. I have libtool installed, so this isn't the only problem.
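For cases like this one, where libtool is already installed but the error persists, the macros may simply be outside aclocal's search path. A rough diagnostic sketch (the default directory is an assumption; your install may differ):

```shell
#!/bin/sh
# List every directory aclocal consults and look for libtool.m4 in each.
SEARCH="$(aclocal --print-ac-dir 2>/dev/null || echo /usr/share/aclocal)"
[ -n "${ACLOCAL_PATH:-}" ] && SEARCH="$SEARCH $(echo "$ACLOCAL_PATH" | tr ':' ' ')"
FOUND=no
for d in $SEARCH; do
  [ -f "$d/libtool.m4" ] && FOUND=yes
done
echo "libtool.m4 visible to aclocal: $FOUND"
# If this prints "no", the usual recovery inside the source tree is:
#   libtoolize --force --copy && aclocal -I m4 && autoreconf -fi
```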
Thank you so much – I ran into the same error when installing a rubygem (rbczmq)
configure.in:54: error: possibly undefined macro: AC_LIBTOOL_WIN32_DLL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
configure.in:55: error: possibly undefined macro: AC_PROG_LIBTOOL
configure:5242: error: possibly undefined macro: AC_DISABLE_STATIC
configure:5246: error: possibly undefined macro: AC_ENABLE_STATIC
and this (well, I used brew install libtool) fixed it for me. I was in for hours more frustration without you!
you rock!
Thanks. Your post helped me fix my build issue.
Thank you!
Thank you very much!
Thank you, it's very helpful.
Nicely done!
still working ! (installing gift and apollon)
You just saved me hours of figuring out what was going on.
I appreciate it!!
Thanks it resolved the problem
Many thanks – Hadoop is a pig of a java program.
Thanks! Works!
it works, thx a lot
Thanks a lot for this post! I had a problem unrelated to faad2, but with same error and it helped me build ZFS with SPL on Centos.
Thanks! Still helpful 7 years later, haha!
Not working for me; I already have libtool installed. (Ubuntu 15)
When building hadoop-0.20.3 I get this error:
Administrator@origianl-PC /cygdrive/e/hadoop_hce_v1/hadoop-0.20.3
$ sh build.sh
/cygdrive/e/hadoop_hce_v1/hadoop-0.20.3
Buildfile: build.xml
clean-contrib:
clean:
clean:
[echo] contrib: capacity-scheduler
clean:
[echo] contrib: datajoin
clean:
[echo] contrib: eclipse-plugin
clean:
[echo] contrib: failmon
clean:
[echo] contrib: fairscheduler
check-libhdfs-fuse:
clean:
Trying to override old definition of task macro_tar
clean:
[echo] contrib: hdfsproxy
clean:
[echo] contrib: hod
clean:
[echo] contrib: index
clean:
[echo] contrib: streaming
clean:
[echo] contrib: thriftfs
clean:
[echo] contrib: vaidya
clean:
BUILD SUCCESSFUL
Total time: 2 seconds
Buildfile: build.xml
clover.setup:
clover.info:
[echo]
[echo] Clover not found. Code coverage reports disabled.
[echo]
clover:
ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
[get] To: E:\hadoop_hce_v1\hadoop-0.20.3\ivy\ivy-2.0.0-rc2.jar
[get] Not modified – so not downloaded
ivy-init-dirs:
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\ivy
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\ivy\lib
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\ivy\report
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\ivy\maven
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: Ivy 2.0.0-rc2 – 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = E:\hadoop_hce_v1\hadoop-0.20.3\ivy\ivysettings.xml
ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#Hadoop;working@origianl-PC
[ivy:resolve] confs: [common]
[ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] found log4j#log4j;1.2.15 in maven2
[ivy:resolve] found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve] found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve] found net.java.dev.jets3t#jets3t;0.6.1 in maven2
[ivy:resolve] found commons-net#commons-net;1.4.1 in maven2
[ivy:resolve] found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] found oro#oro;2.0.8 in maven2
[ivy:resolve] found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] found tomcat#jasper-runtime;5.5.12 in maven2
[ivy:resolve] found tomcat#jasper-compiler;5.5.12 in maven2
[ivy:resolve] found commons-el#commons-el;1.0 in maven2
[ivy:resolve] found junit#junit;3.8.1 in maven2
[ivy:resolve] found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve] found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] found org.mockito#mockito-all;1.8.0 in maven2
[ivy:resolve] :: resolution report :: resolve 1860ms :: artifacts dl 60ms
———————————————————————
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
———————————————————————
| common | 21 | 0 | 0 | 0 || 21 | 0 |
———————————————————————
ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#Hadoop
[ivy:retrieve] confs: [common]
[ivy:retrieve] 21 artifacts copied, 0 already retrieved (7653kB/950ms)
No ivy:settings found for the default reference 'ivy.instance'. A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = E:\hadoop_hce_v1\hadoop-0.20.3\ivy\ivysettings.xml
init:
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\tools
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\src
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\webapps\task\WEB-INF
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\webapps\job\WEB-INF
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\webapps\hdfs\WEB-INF
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\webapps\datanode\WEB-INF
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\webapps\secondary\WEB-INF
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\examples
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\ant
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\c++
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\test
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\test\classes
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\test\testjar
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\test\testshell
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\test\extraconf
[touch] Creating E:\phd2014\d.adly\cygwin–\tmp\null2122987717
[delete] Deleting: E:\phd2014\d.adly\cygwin–\tmp\null2122987717
[copy] Copying 7 files to E:\hadoop_hce_v1\hadoop-0.20.3\build\webapps
[exec] svn: E155007: '/cygdrive/e/hadoop_hce_v1/hadoop-0.20.3' is not a working copy
[exec] svn: E155007: '/cygdrive/e/hadoop_hce_v1/hadoop-0.20.3' is not a working copy
record-parser:
compile-rcc-compiler:
[javac] Compiling 29 source files to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
compile-core-classes:
[javac] Compiling 300 source files to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[copy] Copying 1 file to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
compile-mapred-classes:
[javac] Compiling 293 source files to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\StreamUtil.java:383: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] vargs.add(jvm.toString());
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\StreamUtil.java:385: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] vargs.add("-classpath");
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\StreamUtil.java:386: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] vargs.add("\"" + System.getProperty("java.class.path") + "\"");
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\StreamUtil.java:389: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] vargs.add("-Xmx" + Runtime.getRuntime().maxMemory());
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\StreamUtil.java:392: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] vargs.add(main.getName());
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\StreamUtil.java:394: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] vargs.add(argv[i]);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:200: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] shippedCanonFiles_.add(f.getCanonicalPath());
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:261: warning: [unchecked] unchecked call to addAll(java.util.Collection) as a member of the raw type java.util.ArrayList
[javac] inputSpecs_.addAll(cmdLine.getValues("-input"));
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:269: warning: [unchecked] unchecked call to addAll(java.util.Collection) as a member of the raw type java.util.ArrayList
[javac] packageFiles_.addAll(cmdLine.getValues("-file"));
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:283: warning: [unchecked] unchecked conversion
[javac] found : java.util.List
[javac] required: java.util.List
[javac] List car = cmdLine.getValues("-cacheArchive");
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:291: warning: [unchecked] unchecked conversion
[javac] found : java.util.List
[javac] required: java.util.List
[javac] List caf = cmdLine.getValues("-cacheFile");
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:299: warning: [unchecked] unchecked cast
[javac] found : java.lang.Object
[javac] required: java.util.List
[javac] List jobConfArgs = (List)cmdLine.getValue(jobconf);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:300: warning: [unchecked] unchecked cast
[javac] found : java.lang.Object
[javac] required: java.util.List
[javac] List envArgs = (List)cmdLine.getValue(cmdenv);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:371: warning: [unchecked] unchecked cast
[javac] found : java.util.List
[javac] required: java.util.List
[javac] for (String file : (List)values) {
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:396: warning: [unchecked] unchecked cast
[javac] found : java.util.List
[javac] required: java.util.List
[javac] for (String file : (List)values) {
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:620: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] packageFiles_.add(runtimeClasses);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:622: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] unjarFiles.add(runtimeClasses);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:728: warning: [unchecked] unchecked conversion
[javac] found : java.lang.Class
[javac] required: java.lang.Class
[javac] jobConf_.setInputFormat(fmt);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:802: warning: [unchecked] unchecked call to put(K,V) as a member of the raw type java.util.TreeMap
[javac] sorted.put(en.getKey(), en.getValue());
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:934: warning: [unchecked] unchecked call to add(E) as a member of the raw type java.util.ArrayList
[javac] properties.add(next);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:942: warning: [unchecked] unchecked cast
[javac] found : java.lang.Object
[javac] required: java.util.List
[javac] List oldVal = (List)commandLine.getValue(this);
[javac] ^
[javac] E:\hadoop_hce_v1\hadoop-0.20.3\src\mapred\org\apache\hadoop\mapred\hce\Submitter.java:946: warning: [unchecked] unchecked conversion
[javac] found : java.util.ArrayList
[javac] required: java.util.Collection
[javac] oldVal.addAll(properties);
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 22 warnings
[copy] Copying 3 files to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
compile-hdfs-classes:
[javac] Compiling 120 source files to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[copy] Copying 1 file to E:\hadoop_hce_v1\hadoop-0.20.3\build\classes
compile-core-native:
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\native\Windows_7-x86-32\lib
[mkdir] Created dir: E:\hadoop_hce_v1\hadoop-0.20.3\build\native\Windows_7-x86-32\src\org\apache\hadoop\io\compress\zlib
[javah] [Search path = E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\resources.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\rt.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\sunrsasign.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\jsse.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\jce.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\charsets.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\modules\jdk.boot.jar;E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\classes;E:\hadoop_hce_v1\hadoop-0.20.3\build\classes]
[javah] [Loaded E:\hadoop_hce_v1\hadoop-0.20.3\build\classes\org\apache\hadoop\io\compress\zlib\ZlibCompressor.class]
[javah] [Loaded E:\hadoop_hce_v1\java6\jdk1.6.0_45\jre\lib\rt.jar(java/lang/Object.class)]
[javah] [Forcefully writing file E:\hadoop_hce_v1\hadoop-0.20.3\build\native\Windows_7-x86-32\src\org\apache\hadoop\io\compress\zlib\org_apache_hadoop_io_compress_zlib_ZlibCompressor.h]
[javah] [Loaded E:\hadoop_hce_v1\hadoop-0.20.3\build\classes\org\apache\hadoop\io\compress\zlib\ZlibDecompressor.class]
[javah] [Forcefully writing file E:\hadoop_hce_v1\hadoop-0.20.3\build\native\Windows_7-x86-32\src\org\apache\hadoop\io\compress\zlib\org_apache_hadoop_io_compress_zlib_ZlibDecompressor.h]
[exec] checking for a BSD-compatible install… /usr/bin/install -c
[exec] checking whether build environment is sane… yes
[exec] checking for gawk… gawk
[exec] checking whether make sets $(MAKE)… yes
[exec] checking for gcc… gcc
[exec] checking for C compiler default output file name… a.exe
[exec] checking whether the C compiler works… yes
[exec] checking whether we are cross compiling… no
[exec] checking for suffix of executables… .exe
[exec] checking for suffix of object files… o
[exec] checking whether we are using the GNU C compiler… yes
[exec] checking whether gcc accepts -g… yes
[exec] checking for gcc option to accept ANSI C… none needed
[exec] checking for style of include used by make… GNU
[exec] checking dependency style of gcc… gcc3
[exec] checking build system type… i686-pc-cygwin
[exec] checking host system type… i686-pc-cygwin
[exec] checking for a sed that does not truncate output… /usr/bin/sed
[exec] checking for egrep… grep -E
[exec] checking for ld used by gcc… /usr/i686-pc-cygwin/bin/ld.exe
[exec] checking if the linker (/usr/i686-pc-cygwin/bin/ld.exe) is GNU ld… yes
[exec] checking for /usr/i686-pc-cygwin/bin/ld.exe option to reload object files… -r
[exec] checking for BSD-compatible nm… /usr/bin/nm -B
[exec] checking whether ln -s works… yes
[exec] checking how to recognise dependent libraries… file_magic ^x86 archive import|^x86 DLL
[exec] checking how to run the C preprocessor… gcc -E
[exec] checking for ANSI C header files… yes
[exec] checking for sys/types.h… yes
[exec] checking for sys/stat.h… yes
[exec] checking for stdlib.h… yes
[exec] checking for string.h… yes
[exec] checking for memory.h… yes
[exec] checking for strings.h… yes
[exec] checking for inttypes.h… yes
[exec] checking for stdint.h… yes
[exec] checking for unistd.h… yes
[exec] checking dlfcn.h usability… yes
[exec] checking dlfcn.h presence… yes
[exec] checking for dlfcn.h… yes
[exec] checking for g++… g++
[exec] checking whether we are using the GNU C++ compiler… yes
[exec] checking whether g++ accepts -g… yes
[exec] checking dependency style of g++… gcc3
[exec] checking how to run the C++ preprocessor… g++ -E
[exec] checking for g77… no
[exec] checking for f77… no
[exec] checking for xlf… no
[exec] checking for frt… no
[exec] checking for pgf77… no
[exec] checking for fort77… no
[exec] checking for fl32… no
[exec] checking for af77… no
[exec] checking for f90… no
[exec] checking for xlf90… no
[exec] checking for pgf90… no
[exec] checking for epcf90… no
[exec] checking for f95… f95
[exec] checking whether we are using the GNU Fortran 77 compiler… yes
[exec] checking whether f95 accepts -g… yes
[exec] checking the maximum length of command line arguments… 8192
[exec] checking command to parse /usr/bin/nm -B output from gcc object… ok
[exec] checking for objdir… .libs
[exec] checking for ar… ar
[exec] checking for ranlib… ranlib
[exec] checking for strip… strip
[exec] checking if gcc static flag works… yes
[exec] checking if gcc supports -fno-rtti -fno-exceptions… no
[exec] checking for gcc option to produce PIC…
[exec] checking if gcc supports -c -o file.o… yes
[exec] checking whether the gcc linker (/usr/i686-pc-cygwin/bin/ld.exe) supports shared libraries… yes
[exec] checking whether -lc should be explicitly linked in… yes
[exec] checking dynamic linker characteristics… Win32 ld.exe
[exec] checking how to hardcode library paths into programs… immediate
[exec] checking whether stripping libraries is possible… yes
[exec] checking if libtool supports shared libraries… yes
[exec] checking whether to build shared libraries… yes
[exec] checking whether to build static libraries… yes
[exec] configure: creating libtool
[exec] appending configuration tag "CXX" to libtool
[exec] checking for ld used by g++… /usr/i686-pc-cygwin/bin/ld.exe
[exec] checking if the linker (/usr/i686-pc-cygwin/bin/ld.exe) is GNU ld… yes
[exec] checking whether the g++ linker (/usr/i686-pc-cygwin/bin/ld.exe) supports shared libraries… yes
[exec] checking for g++ option to produce PIC…
[exec] checking if g++ supports -c -o file.o… yes
[exec] checking whether the g++ linker (/usr/i686-pc-cygwin/bin/ld.exe) supports shared libraries… yes
[exec] checking dynamic linker characteristics… Win32 ld.exe
[exec] checking how to hardcode library paths into programs… immediate
[exec] checking whether stripping libraries is possible… yes
[exec] appending configuration tag "F77" to libtool
[exec] checking if libtool supports shared libraries… yes
[exec] checking whether to build shared libraries… yes
[exec] checking whether to build static libraries… yes
[exec] checking for f95 option to produce PIC…
[exec] checking if f95 supports -c -o file.o… yes
[exec] checking whether the f95 linker (/usr/i686-pc-cygwin/bin/ld.exe) supports shared libraries… yes
[exec] checking dynamic linker characteristics… Win32 ld.exe
[exec] checking how to hardcode library paths into programs… immediate
[exec] checking whether stripping libraries is possible… yes
[exec] checking for dlopen in -ldl… yes
[exec] checking for JNI_GetCreatedJavaVMs in -ljvm… no
[exec] checking for ANSI C header files… (cached) yes
[exec] checking stdio.h usability… yes
[exec] checking stdio.h presence… yes
[exec] checking for stdio.h… yes
[exec] checking stddef.h usability… yes
[exec] checking stddef.h presence… yes
[exec] checking for stddef.h… yes
[exec] find: 'E:/hadoop_hce_v1/java6/jdk1.6.0_45/jre/include': No such file or directory
[exec] checking jni.h usability… yes
[exec] checking jni.h presence… yes
[exec] checking for jni.h… yes
[exec] checking zlib.h usability… yes
[exec] checking zlib.h presence… yes
[exec] checking for zlib.h… yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'…
[exec] checking zconf.h usability… yes
[exec] checking zconf.h presence… yes
[exec] checking for zconf.h… yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'… (cached)
[exec] checking for an ANSI C-conforming const… yes
[exec] checking for memset… yes
[exec] configure: creating ./config.status
[exec] config.status: creating Makefile
[exec] config.status: error: cannot find input file: Makefile.in
BUILD FAILED
E:\hadoop_hce_v1\hadoop-0.20.3\build.xml:476: exec returned: 1
How do I solve this? Please help me.
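A hedged note for anyone hitting the final error above ("config.status: error: cannot find input file: Makefile.in"): automake generates Makefile.in from Makefile.am, so this usually means automake never ran (or failed) in that directory. A sketch of the usual check and recovery, run from the affected source directory (commands are the standard autotools ones, not anything Hadoop-specific):

```shell
#!/bin/sh
# The typical recovery is to regenerate everything:
#   autoreconf -fi   # runs aclocal, libtoolize, automake --add-missing, autoconf
# First, report which autotools inputs are actually present here:
REPORT=""
for f in Makefile.am Makefile.in configure; do
  if [ -f "$f" ]; then REPORT="$REPORT $f:present"; else REPORT="$REPORT $f:missing"; fi
done
echo "$REPORT"
```

If Makefile.am is present but Makefile.in is missing, `autoreconf -fi` (or plain `automake --add-missing`) should produce it; if Makefile.am is also missing, the source tree itself is incomplete.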
Thanks so much. You saved me too.
Thank you very much.
worked fine, thanks.
Hi,
This has worked for me too.
I was compiling qemu under Linux Mint.
Thanks a lot