Context:
I have an API endpoint that serves .stl files stored in a public Google Cloud Storage bucket. Until sometime this week, it was working fine. I can step all the way through my own code with the debugger. The npm module in question (readable-stream, per the stack trace below) is not even referenced directly in my project. I've also tried the exact code from Google's documentation for downloading a file and get the same exception: https://github.com/googleapis/nodejs-storage/blob/master/samples/downloadFile.js
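For reference, the sample at that link boils down to roughly the following (a paraphrased sketch, not the exact sample code; the bucket and file names are the ones from this question, and the destination path is just a placeholder):

const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

async function downloadFile() {
  // Downloads the object to a local file via file.download().
  await storage
    .bucket('fancy_induction_ui')                    // bucket from this question
    .file('1inX1in.stl')                             // the .stl linked below
    .download({ destination: '/tmp/1inX1in.stl' });  // placeholder local path
}

downloadFile().catch(console.error);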
What I've tried:
npm rebuild @google-cloud/storage
I've also tried different ways of using the same Google npm package, e.g. buffering the file with download() instead of streaming it (sketched below).
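Roughly, that buffered variant looks like this — a sketch, not necessarily the exact code I ran, and it assumes the usual const { Storage } = require('@google-cloud/storage') at the top of the file:

getFile: async (req, res) => {
  try {
    const fileName = req.param('file');
    const storage = new Storage();
    // download() with no destination resolves with the contents as a Buffer.
    const [contents] = await storage
      .bucket('fancy_induction_ui')
      .file(fileName)
      .download();
    res.setHeader('Content-Type', 'application/octet-stream');
    res.setHeader('Content-Disposition', 'attachment; filename=' + fileName);
    res.send(contents);
  } catch (e) {
    res.status(500).json({ message: 'Something is wrong!', err: e.message });
  }
}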
Questions:
1.) Shouldn't the catch block catch the exception and prevent the crash?
2.) Anyone have any ideas on a workaround?
File: https://storage.cloud.google.com/fancy_induction_ui/1inX1in.stl
Code:
// Storage comes from: const { Storage } = require('@google-cloud/storage');
getFile: async (req, res) => {
  try {
    const fileName = req.param('file');
    res.setHeader('Content-Type', 'application/octet-stream');
    res.setHeader('Content-Disposition', 'attachment; filename=' + fileName);
    const storage = new Storage();
    let file = await storage.bucket('fancy_induction_ui').file(fileName).createReadStream();
    file.pipe(res);
  } catch (e) {
    res.status(500).json({ message: 'Something is wrong!', err: e.message });
  }
}
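Regarding question 1: my understanding (which may be wrong) is that errors emitted on the stream after createReadStream() returns are not thrown synchronously, so the surrounding try/catch never sees them. A variant with an explicit 'error' listener would look roughly like the sketch below; I have not confirmed it avoids this particular crash, since the TypeError appears to come from inside readable-stream during the pipe itself:

getFile: (req, res) => {
  const fileName = req.param('file');
  res.setHeader('Content-Type', 'application/octet-stream');
  res.setHeader('Content-Disposition', 'attachment; filename=' + fileName);
  const storage = new Storage();
  storage
    .bucket('fancy_induction_ui')
    .file(fileName)
    .createReadStream()
    .on('error', (e) => {
      // 'error' is emitted asynchronously on the stream; a try/catch around
      // pipe() never sees it.
      if (!res.headersSent) {
        res.status(500).json({ message: 'Something is wrong!', err: e.message });
      } else {
        res.end();
      }
    })
    .pipe(res);
}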
Stacktrace:
path/to/code/node_modules/readable-stream/lib/_stream_writable.js:317
var isBuf = !state.objectMode && _isUint8Array(chunk);
^
TypeError: Cannot read property 'objectMode' of undefined
at DestroyableTransform.Writable.write (path/to/code/node_modules/readable-stream/lib/_stream_writable.js:317:22)
at PassThrough.ondata (_stream_readable.js:714:22)
at PassThrough.emit (events.js:321:20)
at PassThrough.EventEmitter.emit (domain.js:482:12)
at PassThrough.Readable.read (_stream_readable.js:512:10)
at flow (_stream_readable.js:989:34)
at resume_ (_stream_readable.js:970:3)
at processTicksAndRejections (internal/process/task_queues.js:84:21)
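(In case it matters: a quick way to confirm that readable-stream is only pulled in transitively, and to see whether more than one copy is installed, is:)

npm ls readable-stream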