Sunday, 1 October 2023

Watch application screen size changes in Capacitor/Cordova

The real-world use case is detecting split screen and tablet folding while a Capacitor application is running. Android appears to handle this with "configuration changes", but there seems to be no explicit mention of it in the Capacitor APIs.

How can the changes in application screen size be watched?

Can this reliably be done with the WebView window resize event?



from Watch application screen size changes in Capacitor/Cordova

Composing Svelte Components in JavaScript/TypeScript runtime

Let's say we have a Button Component that takes props variant: 'primary'|'secondary'

<Button variant='primary' on:click={()=>console.log('hello')}> click </Button>

I want to create a PrimaryButton Component that has all the props and actions of the Button Component, but with the variant prop overridden by a default value. I also want to do this without creating a new .svelte file.

I was able to get it almost working with the following code

class PrimaryButton extends Button {
  constructor(options: ComponentConstructorOptions<Omit<ComponentProps<Button>, 'variant'>>) {
    super({...options, props: {variant: 'primary', ...options.props}})
  }
}

<PrimaryButton on:click={()=>console.log('yay it works')}>click</PrimaryButton>

The above code gives a TypeScript error if the Button Component has other required props that I am not setting defaults for.

  1. How do I fix my code to make it also work without giving defaults to all required props?

  2. How do I create a generic function that does the above, so I can use it like the following with correct types?

const PrimaryButton = extendComponent(Button, {variant: 'primary'})

I got close, but the prop types are not working:

export function extendComponent<T extends SvelteComponent>(
  Comp: ComponentType<T>,
  props: Partial<ComponentProps<T>> = {}
) {
  return class ExtendedComponent extends Comp {
    constructor(options: ComponentConstructorOptions<Omit<ComponentProps<T>, keyof typeof props>>) {
      super({...options, props: {...props, ...options.props}})
    }
  } as unknown as ComponentType<SvelteComponent<Partial<ComponentProps<T>>>>
}

const PrimaryButton = extendComponent<Button>(Button, {variant: 'primary'})
  3. Is it possible to give a default on:click at runtime, like I am doing with props?

Thanks!



from Composing Svelte Components in JavaScript/TypeScript runtime

Building a customized Lasagne layer whose output is a matrix of the elementwise product (input x weight) and not the dot product

I have an input sequence of shape (seq_length (19) x features (21)), which I feed as input to a neural network.

I need a layer that performs an elementwise multiplication of the input with the weights (not a dot product), so the output shape should be (#units, input_shape). In my case the input shape is (19 x 21), so the output of the operation performed in that layer is also (19 x 21), and if the number of units is 8, the layer's output should be (8, 19, 21).
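For clarity, the intended operation can be sketched in plain NumPy (an illustration of the shapes only, not Lasagne code; all variable names here are mine): broadcasting a single (19, 21) input against a per-unit weight tensor of shape (8, 19, 21) produces the (8, 19, 21) output described above.

```python
import numpy as np

seq_length, features, num_units = 19, 21, 8

x = np.random.randn(seq_length, features)             # one input sample, (19, 21)
W = np.random.randn(num_units, seq_length, features)  # one weight matrix per unit, (8, 19, 21)

# Elementwise product, broadcast over the leading unit axis:
# (1, 19, 21) * (8, 19, 21) -> (8, 19, 21)
out = x[np.newaxis, :, :] * W
print(out.shape)  # (8, 19, 21)
```

Each `out[k]` is simply `x * W[k]`, i.e. one independent elementwise product per unit.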

How can I do this using Lasagne layers? I checked the Lasagne documentation on how to build custom layers (see the link). Following it, my custom layer is as follows:

class ElementwiseMulLayer(lasagne.layers.Layer):
    def __init__(self, incoming, num_units, W=lasagne.init.Normal(0.01), **kwargs):
        super(ElementwiseMulLayer, self).__init__(incoming, **kwargs)
        self.num_inputs = self.input_shape[1]
        self.num_units = num_units
        self.W = self.add_param(W, (self.num_inputs, num_units), name='W')

    def get_output_for(self, input, **kwargs):
        # return T.dot(input, self.W)
        result = input * self.W
        return result

    def get_output_shape_for(self, input_shape):
        return (input_shape[0], self.num_units, self.num_inputs)

Here's the NN:

l_in_2 = lasagne.layers.InputLayer(shape=(None, 9*19*21))
l_reshape_l_in_2 = lasagne.layers.ReshapeLayer(l_in_2, (-1, 9,19,21))
l_reshape_l_in_2_EL = lasagne.layers.ExpressionLayer(l_reshape_l_in_2, lambda X: X[:,0,:,:], output_shape='auto') 
l_reshape_l_in_2_EL = lasagne.layers.ReshapeLayer(l_reshape_l_in_2_EL, (-1, 19*21))
l_out1 = ElementwiseMulLayer(l_reshape_l_in_2_EL, num_units=8, name='my_EW_layer')
l_out1 = lasagne.layers.ReshapeLayer(l_out1, (-1, 8*399))
l_out = lasagne.layers.DenseLayer(l_out1,
                                num_units = 19*21,
                                W = lasagne.init.Normal(),
                                nonlinearity = lasagne.nonlinearities.rectify)   

It's worth noting that the batch size is 64. The NN summary:

| Layer | Layer_name                | output_shape         |  # parameters  |
_____________________________________________________________________________
|   0   | InputLayer                | (None, 3591)         |          0     |
|   1   | ReshapeLayer              | (None, 9, 19, 21)    |          0     |
|   2   | ExpressionLayer           | (None, 19, 21)       |          0     |
|   3   | ReshapeLayer              | (None, 399)          |          0     |
|   4   | ElementwiseMulLayer       | (None, 8, 399)       |       3192     |
|   5   | ReshapeLayer              | (None, 3192)         |       3192     |
|   6   | DenseLayer                | (None, 399)          |    1277199     |

Now, when I try to build the NN, I receive the following error:

ValueError: GpuElemwise. Input dimension mis-match. Input 1 (indices start at 0) has shape[0] == 399, but the output's size on that axis is 64.
Apply node that caused the error: GpuElemwise{mul,no_inplace}(GpuReshape{2}.0, my_dot_layer.W)
Toposort index: 23
Inputs types: [GpuArrayType<None>(float32, matrix), GpuArrayType<None>(float32, matrix)]
Inputs shapes: [(64, 399), (399, 8)]
Inputs strides: [(14364, 4), (32, 4)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[GpuReshape{2}(GpuElemwise{mul,no_inplace}.0, TensorConstant{[  -1 3192]})]]
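The mismatch can be reproduced outside Theano with plain NumPy, using the shapes from the traceback: an elementwise product of the (64, 399) flattened batch with the (399, 8) weight has no valid broadcast. (The alternative layout at the end is only my guess at a shape that would broadcast, not a claim about the correct fix.)

```python
import numpy as np

x = np.random.randn(64, 399)   # flattened input batch, as in the traceback
W = np.random.randn(399, 8)    # weight from add_param((num_inputs, num_units))

try:
    x * W                      # the same elementwise product as `input * self.W`
except ValueError as e:
    print("broadcast error:", e)

# A weight laid out as (num_units, 1, num_inputs) does broadcast against (64, 399):
W2 = np.random.randn(8, 1, 399)
out = x * W2
print(out.shape)               # (8, 64, 399)
```

NumPy (and Theano) align shapes from the trailing axis, so (64, 399) against (399, 8) fails on the last axis, while (8, 1, 399) broadcasts the batch into the middle axis.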

I tried to set W as follows:

self.W = self.add_param(W, (self.num_inputs,num_units, self.num_inputs), name='W')

but then again received a similar error:

ValueError: GpuElemwise. Input dimension mis-match. Input 1 (indices start at 0) has shape[1] == 8, but the output's size on that axis is 64.
Apply node that caused the error: GpuElemwise{mul,no_inplace}(InplaceGpuDimShuffle{x,0,1}.0, my_EW_layer.W)
Toposort index: 26
Inputs types: [GpuArrayType<None>(float32, (True, False, False)), GpuArrayType<None>(float32, 3D)]
Inputs shapes: [(1, 64, 399), (399, 8, 399)]
Inputs strides: [(919296, 14364, 4), (12768, 1596, 4)]
Inputs values: ['not shown', 'not shown']
Outputs clients: [[GpuReshape{2}(GpuElemwise{mul,no_inplace}.0, TensorConstant{[  -1 3192]})]]

I don't have a clear idea of how to overcome this issue.



from Building a customized Lasagne layer whose output is a matrix of the elementwise product (input x weight) and not the dot product