Take the given example code:
<?php

// Convert a byte count to a human-readable string, e.g. "38.01 MB".
if (! function_exists('human_filesize')) {
    function human_filesize($size, $precision = 2, $step = 1000)
    {
        $i = 0;
        $units = ['B', 'KB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB'];
        while (($size / $step) > 0.9) {
            $size = $size / $step;
            $i++;
        }
        return round($size, $precision) . ' ' . $units[$i];
    }
}

// Dump all arguments and halt, in the style of Laravel's dd() helper.
if (! function_exists('dd')) {
    function dd($vars)
    {
        foreach (func_get_args() as $var) {
            var_dump($var);
        }
        die();
    }
}
$start = microtime(true);
$usage = memory_get_usage(true);

require "brown_corpus.php"; // the file is 1.6 MB on disk

$dump[] = round(microtime(true) - $start, 3);               // elapsed seconds
$dump[] = human_filesize(memory_get_usage(true) - $usage);  // memory delta

dd(...$dump); // 0.063 s to run | 38.01 MB memory used
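As a side note on the measurement itself: memory_get_usage(true) reports the memory PHP's allocator has reserved from the system, which grows in coarse chunks, while memory_get_usage(false) reports what the script is actually using at that moment. A minimal sketch of the difference, assuming the same brown_corpus.php file and the human_filesize() helper defined above:

<?php
// Sketch: compare the two memory_get_usage() modes around the same require.
// Assumes the human_filesize() helper above and the same brown_corpus.php file.
$realBefore = memory_get_usage(true);   // memory reserved from the system
$usedBefore = memory_get_usage(false);  // memory actually in use by the script

require "brown_corpus.php";

echo 'reserved: ' . human_filesize(memory_get_usage(true) - $realBefore) . PHP_EOL;
echo 'in use:   ' . human_filesize(memory_get_usage(false) - $usedBefore) . PHP_EOL;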
brown_corpus.php is 1.6 MB on disk, but once it's required the script reports 38.01 MB of memory used. I've been doing some reading and I'm wondering whether this is because PHP compiles required files into opcodes for faster execution. Can someone enlighten me on the pros and cons of this? For example, if I search for keys within an array defined in that required file, is the lookup now faster because of the way PHP has compiled the file?
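One factor worth sketching, separate from opcode compilation: PHP arrays carry per-element overhead (zvals, hashtable buckets), so data that is 1.6 MB as source text can occupy many times that once loaded into an array. A self-contained illustration of the effect; the element count and string contents here are arbitrary stand-ins, not the actual corpus:

<?php
// Sketch: memory used by an array of short strings vs. the raw string bytes.
// The element count and contents are arbitrary stand-ins for the corpus data.
$before = memory_get_usage(false);

$words = [];
$rawBytes = 0;
for ($i = 0; $i < 200000; $i++) {
    $word = 'word' . $i;        // only a few bytes of actual payload
    $rawBytes += strlen($word);
    $words[] = $word;
}

$after = memory_get_usage(false);

echo 'raw string bytes: ' . round($rawBytes / 1048576, 2) . " MB\n";
echo 'array memory:     ' . round(($after - $before) / 1048576, 2) . " MB\n";
// On a typical 64-bit build the array total is several times the raw byte count.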
from PHP uses more memory than the file required