
linux - ELF Header or installation issue with bcrypt in Docker container

Kind of a long shot, but has anyone had any problems using bcrypt in a Linux container (specifically Docker) and know of an automated workaround? I have the same issue as these two:

Invalid ELF header with node bcrypt on AWSBox

bcrypt invalid elf header when running node app

My Dockerfile

# Pull base image
FROM node:0.12

# Expose port 8080
EXPOSE 8080

# Add current directory into path /data in image
ADD . /data

# Set working directory to /data
WORKDIR /data

# Install dependencies from package.json
RUN npm install --production

# Run index.js
CMD ["npm", "start"]

I get the previously mentioned invalid ELF header error if I already have bcrypt installed in my node_modules, but if I remove it (either just bcrypt or all of my packages), it isn't installed when I build the container for some reason. I have to manually enter the container after the build and install it inside.

Is there an automated workaround?

Or, failing that, what would be a good alternative to bcrypt in a Node stack?

1 Answer

Liam's comment is on the money; just expanding on it for future travellers on the internets.

The issue is that you've copied your node_modules folder into your container. The reason this is a problem is that bcrypt is a native module. It's not just JavaScript, but also a bunch of C/C++ code that gets compiled when the module is installed.

The binaries that come out of that compilation are stored in the node_modules folder, and they're customised to the platform they were built on. Transplanting them out of their OS X home into a strange Linux land causes them to misbehave and complain about ELF headers and fairy feet.
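
You can see this for yourself on the host machine. Assuming bcrypt was installed on OS X and a typical node-gyp build layout (the exact path may vary between bcrypt versions), the compiled addon is a Mach-O binary rather than a Linux ELF one:

file node_modules/bcrypt/build/Release/bcrypt_lib.node
# Typically reports something like "Mach-O 64-bit bundle x86_64" on OS X,
# which is exactly what the Linux loader inside the container cannot read.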

The solution is to echo node_modules >> .dockerignore and run npm install as part of your Dockerfile. This means that the native modules will be compiled inside the container rather than outside it on your laptop.
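
Concretely, that means a .dockerignore file next to your Dockerfile that looks something like this (a minimal sketch; add any other host-only files you don't want in the build context):

# .dockerignore -- keep the host's node_modules out of the build context
node_modules
npm-debug.log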

With this in place, there is no need to run npm install before your start CMD. Just having it in the build phase of the Dockerfile is fine.

protip: the official node images set NODE_ENV=production by default, which npm treats the same as the --production flag. Most of the time this is a good thing, but not when your Dockerfile also contains build steps that rely on dev dependencies (webpack, etc.). In that case you want NODE_ENV=null npm install.
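
For example, here is a sketch of the middle of a Dockerfile that needs dev dependencies for a build step (the npm run build script is hypothetical; substitute whatever your build actually runs):

# Override the image's NODE_ENV=production so dev dependencies install too
RUN NODE_ENV=null npm install

# Hypothetical build step that relies on dev dependencies (webpack, etc.)
RUN npm run build

# Optionally drop dev dependencies again to keep the image small
RUN npm prune --production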

pro protip: you can take better advantage of Docker's caching by copying in your package.json separately from the rest of your code. Make your Dockerfile look like this:

# Pull base image
FROM node:0.12

# Expose port 8080
EXPOSE 8080

# Set working directory to /data
WORKDIR /data

# Copy package.json into /data
COPY package.json /data

# Install dependencies from package.json
RUN npm install

# Add current directory into path /data in image
ADD . /data

# Run index.js
CMD ["npm", "start"]

And that way Docker will only re-run npm install when you change your package.json, not every time you change a line of code.
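
You can verify the caching behaviour with two builds in a row (the my-app tag here is just an example name):

# First build compiles bcrypt and friends inside the container
docker build -t my-app .

# Touch a source file (but not package.json) and build again:
# the RUN npm install layer is reused from cache, so the rebuild is fast
docker build -t my-app .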

