Saturday, 25 March 2023

Filter on json data with in_() sqlalchemy

I want to filter on features (a JSON field in the database) where items is "a" or "b", but the query below returns 0 rows. When I filter on other columns, in_() works correctly, and it also returns the correct rows with ["a"] or ["b"] alone. What is the reason, and what is the solution?

data.filter(Data.id.in_([1,2])) #works

data.filter(Data.features['items'].in_(["a"])) # returns 3

data.filter(Data.features['items'].in_(["b"])) # returns 1

data.filter(Data.features['items'].in_(["a","b"])) # returns 0, I expect 4
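(Not part of the original question, but one commonly suggested workaround is to compare the extracted JSON value as text rather than as a JSON-typed value, via as_string(). A minimal, self-contained sketch against an in-memory SQLite database; the Data model below is assumed from the snippets, not taken from the asker's code:)

```python
# Hypothetical minimal model reproducing the setup (names assumed).
from sqlalchemy import JSON, Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Data(Base):
    __tablename__ = "data"
    id = Column(Integer, primary_key=True)
    features = Column(JSON)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([
        Data(features={"items": "a"}),
        Data(features={"items": "a"}),
        Data(features={"items": "a"}),
        Data(features={"items": "b"}),
        Data(features={"items": "c"}),
    ])
    session.commit()

    # Compare the extracted value as text instead of as a JSON value,
    # so IN ('a', 'b') matches plain strings.
    matched = (
        session.query(Data)
        .filter(Data.features["items"].as_string().in_(["a", "b"]))
        .count()
    )
    print(matched)
```

Whether the plain JSON-typed comparison works can depend on the backend dialect, which may explain why it behaves differently from ordinary columns.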


from Filter on json data with in_() sqlalchemy

Keras: time per step increases with a filter on the number of samples, epoch time continues the same

I'm implementing a simple sanity-check model in Keras for some data I have. My training dataset comprises about 550 files, each contributing about 150 samples. Each training sample has the following signature:

({'input_a': TensorSpec(shape=(None, 900, 1), dtype=tf.float64, name=None),
  'input_b': TensorSpec(shape=(None, 900, 1), dtype=tf.float64, name=None)},
   TensorSpec(shape=(None, 1), dtype=tf.int64, name=None)
)

Essentially, each training sample is made up of two inputs with shape (900, 1), and the target is a single (binary) label. The first step of my model is a concatenation of inputs into a (900, 2) Tensor.

The total number of training samples is about 70000.

As input to the model, I'm creating a tf.data.Dataset, and applying a few preparation steps:

  1. tf.Dataset.filter: to filter some samples with invalid labels
  2. tf.Dataset.shuffle
  3. tf.Dataset.filter: to undersample my training dataset
  4. tf.Dataset.batch

Step 3 is the most important in my question. To undersample my dataset I apply a simple function:

from typing import Iterable

import tensorflow as tf

def undersampling(dataset: tf.data.Dataset, drop_proba: Iterable[float]) -> tf.data.Dataset:
    def undersample_function(x, y):

        # Per-label drop probabilities, indexed by the sample's label.
        drop_prob_ = tf.constant(drop_proba)

        idx = y[0]

        p = drop_prob_[idx]
        v = tf.random.uniform(shape=(), dtype=tf.float32)

        # Keep the sample only if the uniform draw is at least its drop probability.
        return tf.math.greater_equal(v, p)

    return dataset.filter(undersample_function)

Essentially, the function accepts a vector of probabilities drop_proba such that drop_proba[l] is the probability of dropping a sample with label l (the function is a bit convoluted, but it's the way I found to implement it as a Dataset.filter). Using equal probabilities, say drop_proba=[0.9, 0.9], I'll be dropping about 90% of my samples.

Now, the thing is, I've been experimenting with different undersampling rates for my dataset, in order to find a sweet spot between performance and training time, but when I undersample, the epoch duration stays the same, with the time per step increasing instead.
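(An aside not in the original question, just to illustrate the likely cost structure: Dataset.filter still has to evaluate its predicate on every upstream sample each epoch, so dropping 90% of samples shrinks the surviving batch without shrinking the per-epoch scan. A plain-Python sketch of the same idea, with no TensorFlow involved:)

```python
import random

def undersample(samples, drop_proba):
    """Drop each (x, y) pair with probability drop_proba[y].

    Like Dataset.filter, this must still visit every sample to decide
    whether to keep it, so the scan cost per epoch stays proportional
    to the full dataset regardless of how many samples survive.
    """
    visited = 0
    kept = []
    for x, y in samples:
        visited += 1
        if random.random() >= drop_proba[y]:
            kept.append((x, y))
    return kept, visited

samples = [(i, i % 2) for i in range(70000)]
kept, visited = undersample(samples, [0.9, 0.9])
print(visited)    # every record is still scanned
print(len(kept))  # only roughly 10% survive
```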

Keeping my batch_size fixed at 20000, for the complete dataset I have a total of 4 batches, and the following time for an average epoch:

Epoch 4/1000
1/4 [======>.......................] - ETA: 9s
2/4 [==============>...............] - ETA: 5s
3/4 [=====================>........] - ETA: 2s
4/4 [==============================] - ETA: 0s
4/4 [==============================] - 21s 6s/step

While if I undersample my dataset with drop_proba = [0.9, 0.9] (that is, I'm getting rid of about 90% of the dataset), keeping the same batch_size of 20000, I have 1 batch, and the following time for an average epoch:

Epoch 4/1000
1/1 [==============================] - ETA: 0s
1/1 [==============================] - 22s 22s/step 

Notice that while the number of batches is only 1, the epoch time is the same! It just takes longer to process the batch.

Now, as a sanity check, I tried a different way of undersampling, by filtering the files instead. So I selected about 55 of the training files (10%), to get a similar number of samples in a single batch, and removed the undersampling from the tf.Dataset. The epoch time decreases as expected:

Epoch 4/1000
1/1 [==============================] - ETA: 0s
1/1 [==============================] - 2s 2s/step 

Note that the original dataset has 70014 training samples, while the undersampled dataset by means of tf.Dataset.filter had 6995 samples and the undersampled dataset by means of file filtering had 7018 samples, thus the numbers are consistent.

Much faster. In fact, it takes about 10% of the time the epoch takes with the full dataset. So there is an issue with the way I'm performing undersampling (using tf.data.Dataset.filter) when creating the tf.Dataset, and I would like to ask for help figuring out what the issue is. Thanks.



from Keras: time per step increases with a filter on the number of samples, epoch time continues the same

Position:absolute issue in react native

I am making a react native application in which there is a left and right section.

The left section has flex: 0.7 and the right section has flex: 0.2.

Inside the left section I have a container with an ImageBackground that looks like a circuit skeleton

(screenshot: circuit-skeleton background image)

and inside that I am in the need to place sub components in the respective position.

Expected Result:

(screenshot: expected result)

Things I have tried:

Pure HTML and CSS way: (Working as expected)

.container {
  display: flex;
  flex: 1;
  flex-direction: row;
  justify-content: space-between;
  align-items: center;
}

.leftSection {
  flex: 0.7;
}

.rightSection {
  flex: 0.2;
  background-color: #ccc;
}

.bgContainer {
  background-repeat: no-repeat;
  position: relative;
  margin: 0 auto;
}

.bg-img {
  display: block;
  width: 100%;
}

.coil {
  position: absolute;
  top: 49.55%;
  left: 24.3%;
  width: 17.4418605%;
}

.evaporator {
  position: absolute;
  top: 7.25%;
  left: 54.5%;
  width: 11.627907%;
}

.compressor {
  position: absolute;
  top: 53.15%;
  left: 59.2%;
  width: 13.0813953%;
}

.component img {
  display: block;
  width: 100%;
}
<div class="container">
  <div class="leftSection">
    <div class="bgContainer">
      <img src="https://i.stack.imgur.com/AfygH.png" class="bg-img" />
      <div class="component coil">
        <img src="https://i.stack.imgur.com/SKUms.png" alt="coil-image" />
      </div>
      <div class="component evaporator">
        <img src="https://i.stack.imgur.com/spv58.png" alt="evaporator-image" />
      </div>
      <div class="component compressor">
        <img src="https://i.stack.imgur.com/fzSaH.png" alt="compressor-image" />
      </div>
    </div>
  </div>
  <div class="rightSection">
    Right Section
  </div>
</div>

But as I am doing this in a React Native application, I have tried translating it into the React Native way:

import React from 'react';
import { View, Image, StyleSheet, Text, ImageBackground } from 'react-native';

const styles = StyleSheet.create({
  container: {
    flex: 1,
    flexDirection: 'row',
    justifyContent: 'space-between',
    alignItems: 'center',
  },
  leftSection: {
    flex: 0.7,
  },
  rightSection: {
    flex: 0.2,
    backgroundColor: '#ccc',
  },
  bgContainer: {
    position: 'relative',
    margin: 0,
  },
  bgImg: {
    width: '100%',
  },
  coil: {
    position: 'absolute',
    top: '49.55%',
    left: '24.3%',
    width: '17.4418605%',
  },
  evaporator: {
    position: 'absolute',
    top: '7.25%',
    left: '54.5%',
    width: '11.627907%',
  },
  compressor: {
    position: 'absolute',
    top: '53.15%',
    left: '59.2%',
    width: '13.0813953%',
  },
  componentImg: {
    width: '100%',
  },
});

const App = () => {
  return (
    <View style={styles.container}>
      <View style={styles.leftSection}>
        <View style={styles.bgContainer}>
          <ImageBackground
            source={{ uri: 'https://i.stack.imgur.com/AfygH.png' }}
            style={styles.bgImg}
          >
          <View style={styles.coil}>
            <Image
              source={{ uri: 'https://i.stack.imgur.com/SKUms.png' }}
              style={styles.componentImg}
            />
          </View>
          <View style={styles.evaporator}>
            <Image
              source={{ uri: 'https://i.stack.imgur.com/spv58.png' }}
              style={styles.componentImg}
            />
          </View>
          <View style={styles.compressor}>
            <Image
              source={{ uri: 'https://i.stack.imgur.com/fzSaH.png' }}
              style={styles.componentImg}
            />
          </View>
          </ImageBackground>
        </View>
      </View>
      <View style={styles.rightSection}>
        <Text>Right Section</Text>
      </View>
    </View>
  );
};

export default App;

Issue:

After the implementation,

The screenshot below was captured in a viewport with height 844 and width 1280:

(screenshot: 1280 × 844 viewport)

The screenshot below was captured in a viewport with height 552 and width 1024:

(screenshot: 1024 × 552 viewport)

I am making this mainly for tablet screens of all heights and widths. The pure HTML and CSS version is responsive, but on tablet screens in React Native it is not responsive at all.

Kindly help me make the position: absolute elements responsive, so that they stay in the same position across all screens without distortion.

Note: I edited my question to mention that this implementation is in React Native.



from Position:absolute issue in react native