Sunday, July 11, 2021

Recent Questions - Stack Overflow



Mongoose creating __v field instead of date field

Posted: 11 Jul 2021 07:52 AM PDT

I have this model:

const mongoose = require("mongoose");
const Schema = mongoose.Schema;

const ModelNameSchema = new Schema({
  user: {
    type: Schema.Types.ObjectId,
    ref: "users",
  },
  date: {
    type: Date,
    dafault: Date.now,
  },
});

module.exports = ModelName = mongoose.model(
  "model_name",
  ModelNameSchema
);

I tried to create documents using this:

const saveDocument = function saveDocument(user_id) {
  const document_data = {
    user: user_id,
  };
  const new_document = new ModelName(document_data);
  const document = new_document.save();
  return document;
};

For some strange reason, instead of creating the date field, it creates the __v field.
Here is an example of two documents I have created:

[
    {
        "_id": "60eb01a29e84151343183f4d",
        "__v": 0
    },
    {
        "_id": "60eb03ccc465491984b3bf99",
        "__v": 0
    }
]

Any idea what's going on here?

How to print the results of successive conditions so that they follow each other instead of being sorted by values?

Posted: 11 Jul 2021 07:52 AM PDT

My code assigns the values 1 and 0 inside the loop. The if conditions then print only the cases where the value 1 appears in certain positions. That part works as it should.

import os
import numpy as np
from itertools import combinations

a = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
b = np.array([0, 1, 2, 0, 1, 2, 0, 1, 2])
z1 = np.array([1, 1])
z2 = np.array([1, 1])
comb_x = np.array([0, 0, 1, 1])
comb_y = np.array([0, 1, 0, 1])
for j, k in zip(a, b):
    # print(j, k)
    z1[:] = 0
    z1[:j] = 1
    x12 = z1
    z2[:] = 0
    z2[:k] = 1
    y12 = z2
    # print(x12, y12)

    if x12[0] == 1 and y12[0] == 1:
        print(x12, y12)
    if x12[0] == 1 and y12[1] == 1:
        print(x12, y12)
    if x12[1] == 1 and y12[0] == 1:
        print(x12, y12)
    if x12[1] == 1 and y12[1] == 1:
        print(x12, y12)

The problem is that it does not print the results grouped by condition, in the order the conditions follow each other; instead the output comes out ordered by the number of 1s, from fewest to most.


my output:

[1 0] [1 0]
[1 0] [1 1]
[1 0] [1 1]
[1 1] [1 0]
[1 1] [1 0]
[1 1] [1 1]
[1 1] [1 1]
[1 1] [1 1]
[1 1] [1 1]

When I run the individual conditions one by one, the list of each condition is as follows

if x12[0] == 1 and y12[0] == 1:
    print(x12, y12)

[1 0] [1 0]
[1 0] [1 1]
[1 1] [1 0]
[1 1] [1 1]

if x12[0] == 1 and y12[1] == 1:
    print(x12, y12)

[1 0] [1 1]
[1 1] [1 1]

if x12[1] == 1 and y12[0] == 1:
    print(x12, y12)

[1 1] [1 0]
[1 1] [1 1]

if x12[1] == 1 and y12[1] == 1:
    print(x12, y12)

[1 1] [1 1]

This is what the output should look like even when I run them all at once: the results of the individual conditions above, one after another.

required output

[1 0] [1 0]
[1 0] [1 1]
[1 1] [1 0]
[1 1] [1 1]
[1 0] [1 1]
[1 1] [1 1]
[1 1] [1 0]
[1 1] [1 1]
[1 1] [1 1]

I also tried to automate it via for loop and it will sort it as well.

for h, n, r in zip(comb_x, comb_y, np.arange(0, 4)):
    # print(h, n, 'iteracia = ', r)
    if x12[h] == 1 and y12[n] == 1:
        print('pravda', x12, y12)

I don't understand why the output comes out sorted in ascending order by the number of 1s.

Can anyone advise me how to print the results in the order of the conditions, instead of having them sorted?
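One way to get the grouped order (a sketch based on the arrays above, not the asker's exact code): make the conditions the outer loop and the iterations the inner loop, recomputing x12/y12 for every pair. The output is then grouped by condition instead of being "sorted" by the number of 1s:

```python
import numpy as np

a = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
b = np.array([0, 1, 2, 0, 1, 2, 0, 1, 2])
comb_x = np.array([0, 0, 1, 1])
comb_y = np.array([0, 1, 0, 1])

def masks(j, k):
    # rebuild x12/y12 for one (j, k) pair, as in the original loop body
    x12 = np.zeros(2, dtype=int); x12[:j] = 1
    y12 = np.zeros(2, dtype=int); y12[:k] = 1
    return x12, y12

results = []
for h, n in zip(comb_x, comb_y):      # outer: one pass per condition
    for j, k in zip(a, b):            # inner: all iterations for that condition
        x12, y12 = masks(j, k)
        if x12[h] == 1 and y12[n] == 1:
            results.append((list(x12), list(y12)))
            print(x12, y12)
```

With the arrays above, this prints the nine matches in exactly the "required output" order: the four matches of the first condition, then two, then two, then one.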

Docker-compose: how to not expose ports in prod .env?

Posted: 11 Jul 2021 07:52 AM PDT

I need to expose ports in dev environment, and hide them in prod.

docker-compose.yml:

  mongo:
    image: mongo
    restart: always
    volumes:
      - db-data:/data/db
    networks:
      - mongo
    ports:
      - "${MONGO_PORT}:${MONGO_PORT}"

Then I created .env and .env.dev, where I define it as MONGO_PORT=27017. The problem is that I cannot find a way to disable port publishing in .env (which is supposed to be prod). If I leave it empty (MONGO_PORT=), compose fails with this error:

ERROR: The Compose file '././docker-compose.yml' is invalid because:
services.mongo.ports contains an invalid type, it should be a number, or an object
make: *** [up] Error 1

If I redefine the ports entry like ports: MONGO_PORT, it fails at YAML parsing.

Is there a simple way to achieve this, without creating second docker-compose.yml?
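A common pattern (a sketch, using a hypothetical file name) is to leave ports out of the base docker-compose.yml entirely and publish them only from a dev override file; Compose merges the files passed with -f in order:

```yaml
# docker-compose.dev.yml (hypothetical): dev-only additions
services:
  mongo:
    ports:
      - "27017:27017"
```

Dev then runs docker-compose -f docker-compose.yml -f docker-compose.dev.yml up, while prod runs docker-compose up against the base file alone, so nothing is published.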

dotplot using basic graphic in R (not ggplot)

Posted: 11 Jul 2021 07:52 AM PDT

I have this dataset and I want to draw a dotplot out of it (not with ggplot functions).

    species measurement test  ref         log
1    asinus           1 73.6 76.6 -0.03995205
2    asinus           2 67.1 69.9 -0.04088161
3    asinus           3 24.5 24.8 -0.01217054
4    asinus           4 39.8 41.1 -0.03214121
5    asinus           5 29.0 31.0 -0.06669137
6  caballus           1 79.8 76.6  0.04092643
7  caballus           2 71.7 69.9  0.02542510
8  caballus           3 33.4 24.8  0.29771225
9  caballus           4 52.9 41.1  0.25239522
10 caballus           5 34.7 31.0  0.11275248
11    grevy           1 85.5 76.6  0.10991930
12    grevy           2 78.9 69.9  0.12111558
13    grevy           3 30.2 24.8  0.19699827
14    grevy           4 50.2 41.1  0.20000691
15    grevy           5 35.0 31.0  0.12136086
16    kiang           1 85.0 76.6  0.10405418
17    kiang           2 79.4 69.9  0.12743272
18    kiang           3 25.9 24.8  0.04339932
19    kiang           4 43.7 41.1  0.06133998
20    kiang           5 32.1 31.0  0.03486883

I used this command:

dotplot(measurement ~ log)

I have three questions: 1- I want to swap the x and y axes, so that the measurement code appears on the x axis and log on the y axis. 2- I want to color-code the dots based on the species column. 3- I want to connect the dots that belong to a single species.

Could anyone help please? Many thanks in advance

React Hook Form with AsyncSelect from React-Select

Posted: 11 Jul 2021 07:52 AM PDT

I'm working on an issue with the react-select AsyncSelect component, which loads options from an API, but I can't pass the information to React Hook Form through the Controller. AsyncSelect itself works perfectly, and the data lands correctly in my selectedValue state. Can anyone help me?

const [inputValue, setValue] = useState('');
const [selectedValue, setSelectedValue] = useState(null);

// handle input change event
const handleInputChange = value => {
    setValue(value);
};

// handle selection
const handleChange = value => {
    setSelectedValue(value);
}

const loadOptions = async (inputValue, callback) => {
    const response = await fetch(`APIurl`);
    const json = await response.json();
    const object = json.records;
    callback(object.map(i => ({
        label: `${i.fields.firstName} - ${i.fields.lasName}`,
        value: i.fields.firstName
    })))
}

<Controller
   name="company"
   control={control}
   rules={{ required: true }}
   render={({ field: { onChange, value } }) => (
       <AsyncSelect
         isClearable
         value={selectedValue}
         placeholder={'Your information'}
         loadOptions={loadOptions}
         onInputChange={handleInputChange}
         onChange={handleChange}
         styles={customStyles}
       />)}
/>

Loading images from an AWS bucket media folder to a HTML pdf template

Posted: 11 Jul 2021 07:51 AM PDT

I am trying to generate a PDF file out of a post created by a user, and include an image, which I want to store in an AWS bucket.

This is how it was working with my own file system

My function to generate a pdf in views.py:

def form_render_pdf_view(request, *args, **kwargs):
    pk = kwargs.get('pk')
    form = get_object_or_404(Post, pk=pk)

    template_path = 'form/pdf2.html'
    context = {
        'form': form
    }
    # Create a Django response object, and specify content_type as pdf
    response = HttpResponse(content_type='application/pdf')
    response['Content-Disposition'] = 'filename="report.pdf"'
    # find the template and render it.
    template = get_template(template_path)
    html = template.render(context)

    # create a pdf
    pisa_status = pisa.CreatePDF(
        html, dest=response)

    if pisa_status.err:
        return HttpResponse('We had some errors <pre>' + html + '</pre>')
    return response

And rendering the image in the html template

<div>
    <img src="{{form.image.path}}">
</div>

Now, after moving the uploads to a bucket, it doesn't seem to work anymore. The image is stored there after uploading the post, but when creating the PDF the image is not displayed, and I get an error: "This backend doesn't support absolute paths". I tried to change the image source in the HTML template, but it still didn't work.
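A hedged note on the error text: "This backend doesn't support absolute paths" is what remote storage backends raise for .path, because an S3 object has no local filesystem path. The template would instead use the file's URL (whether xhtml2pdf then fetches the remote image directly, or additionally needs a link_callback, is untested here):

```html
<div>
    <img src="{{ form.image.url }}">
</div>
```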

Can MSVC be configured to throw a compiler error when it runs into an undeclared function?

Posted: 11 Jul 2021 07:51 AM PDT

I'm currently using MSVC and Visual Studio. If the compiler runs into an undeclared function, it automatically assumes its form. For example, if I forget to #include <assert.h> it will assume the form

int assert()  

The compiler then proceeds with compilation and outputs a lot of "nonsense" errors, which makes it difficult to trace down the actual issue (i.e. a missing #include).

Is it possible to configure the compiler to throw an error if it runs into an undeclared function?
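For reference, this implicit-declaration behavior only applies when compiling C (in C++ calling an undeclared function is already a hard error). MSVC reports it as warning C4013 ("undefined; assuming extern returning int"), and a specific warning can be promoted to an error with /we, either on the command line or via the project's "Treat Specific Warnings As Errors" property:

```shell
cl /we4013 main.c
```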

Docker container with Postgres Database fails to build in Azure Devops

Posted: 11 Jul 2021 07:51 AM PDT

I am trying to build a docker container in Azure Devops, that ultimately is deployed as a Lambda Function to AWS. The code is written in F# and the Postgres database provides a schema for the type provider in the F# solution to reference.

I am using the standard Docker@2 build step. The Dockerfile below works fine on a macOS system.

The Dockerfile looks something like:

FROM mcr.microsoft.com/dotnet/sdk:5.0 AS base

WORKDIR /src

RUN apt-get update && apt-get install -y postgresql postgresql-common
RUN echo "host all  all    0.0.0.0/0  md5" >> /etc/postgresql/11/main/pg_hba.conf
RUN echo "listen_addresses='*'" >> /etc/postgresql/11/main/postgresql.conf

COPY . .
USER postgres
RUN /etc/init.d/postgresql start &&\
    psql --dbname "postgres" < "schema.sql"

USER root

RUN dotnet publish "./PATH_TO_SOLUTION.sln" -c Release -r linux-x64 -o /app/publish

FROM public.ecr.aws/lambda/dotnet:5.0 AS final
WORKDIR /var/task
ENV ASPNETCORE_URLS http://*:5000
COPY --from=base /app/publish .

The pipeline configuration looks like:

trigger:
- master

resources:
- repo: self

variables:
  tag: '$(Build.BuildId)'

stages:
- stage: Build
  displayName: Build image
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: DockerInstaller@0
      inputs:
        dockerVersion: '20.10.7'
    - task: Docker@2
      displayName: Build an image
      inputs:
        command: build
        dockerfile: '$(Build.SourcesDirectory)/Dockerfile'
        buildContext: '.'
        arguments: --add-host=host.docker.internal:host-gateway
        tags: |
          $(tag)

If I add the step RUN echo $(pg_isready -d postgres -h gateway.docker.internal) to the Dockerfile, I can see that when building locally, the host is available and accepting connections. In DevOps, the host is not available. The only difference I can see is that the DevOps pipeline uses Ubuntu as the build host.

How to implement behavior similar to JavaScript's onMouseUp, onMouseEnter, and onMouseDown in Jetpack Compose?

Posted: 11 Jul 2021 07:51 AM PDT

I have a 2d grid UI (imagine tiles of a chess board). As an initial state, all tiles are white but when I click a tile, it should change to black.

The experience I want to implement: if I long-press a tile and drag my finger to the right across the other tiles, the tiles I pass through (while holding the initial long press) should turn black.

I have tried pointerInput with detectDragGestures and detectTapGestures, but they only provide callbacks for the tile where I started the long press. They cannot trigger callbacks for the tiles that I pass through while dragging my finger along the other tiles.

In JavaScript, I believe this can be implemented using the onMouseUp, onMouseEnter, and onMouseDown events.

How to solve [ERROR:flutter/lib/ui/ui_dart_state.cc(186)] Unhandled Exception

Posted: 11 Jul 2021 07:51 AM PDT

I am getting the error [ERROR:flutter/lib/ui/ui_dart_state.cc(186)] Unhandled Exception: Exception: Login failed. Does anyone know how to solve this? Thanks in advance.

Here is my code. This is my main.dart; I want to get the error from the response and then show it in an alert dialog.

Future<void> _submit() async {
    try {
      final response = await Provider.of<AuthService>(context, listen: false).loginWithGoogle();
    }
    catch (e) {
      showAlertDialog(context, e);
    }
}

Here is my service.dart

Future<dynamic> loginWithGoogle() async {
    try {
      _authProvider = AuthProvider.Google;

      GoogleSignInAccount user = await _googleSignIn.signIn();
      GoogleSignInAuthentication googleSignInAuthentication =
          await user.authentication;

      loginWithMediaSocial(googleSignInAuthentication.accessToken)
          .then((value) {
        if (value['errorCode'] == 500) {
          String error = "Login failed.";
          throw Exception(error);
        }
      });
    } catch (e) {
      print(e);
    }
}

Conversion of a Keras sequential model to a spiking neural network model with nengo-dl

Posted: 11 Jul 2021 07:51 AM PDT

I am working on a project for exoplanet detection using spiking neural networks. First I have a Keras sequential model, which I want to convert using the nengo-dl library.

Data:
training data: x_train.shape = (4376, 3197), training data labels: y_train.shape = (4376,)
testing data: x_test.shape = (570, 3197), testing data labels: y_test.shape = (570,)

my sequential model:

def cnn_model():
  # Data preparation
  x_train, y_train, x_test, y_test, n_features = datasets()
  x_train, y_train = shuffle(x_train, y_train)  # shuffle the data to avoid stagnant 0.0000e+00 val_accuracy

  # Architecture
  model = Sequential()
  model.add(Reshape((3197, 1), input_shape=(3197,)))
  model.add(Conv1D(filters=10, kernel_size=2, activation='relu', input_shape=(n_features, 1), kernel_regularizer='l2'))
  model.add(MaxPooling1D(pool_size=2, strides=2))
  model.add(Dropout(0.2))
  model.add(Flatten())
  model.add(Dense(48, activation="relu"))
  model.add(Dropout(0.4))
  model.add(Dense(18, activation="relu"))
  model.add(Dense(1, activation="sigmoid"))

  # Representation of architecture
  print(model.summary())

  # Compile model
  lr_schedule = ExponentialDecay(initial_learning_rate=1e-2, decay_steps=10000, decay_rate=0.94)

  model.compile(optimizer=Adam(learning_rate=lr_schedule), loss='binary_crossentropy', metrics=['accuracy'])

  # Fit model
  early_stop = EarlyStopping(monitor='val_loss', patience=7, restore_best_weights=True)

  history = model.fit(x_train, y_train, validation_split=0.2, batch_size=64, callbacks=[early_stop], epochs=30, verbose=2)

  # Evaluate the model
  _, train_acc = model.evaluate(x_train, y_train, verbose=2)
  _, test_acc = model.evaluate(x_test, y_test, verbose=2)


The sequential model works fine. But when I convert it to an SNN using nengo_dl, it gives an error:


from keras import layers, models

input_layer = layers.Input(batch_shape=cnn_model1.layers[0].input_shape)
prev_layer = input_layer
for layer in cnn_model1.layers:
    prev_layer = layer(prev_layer)

funcmodel = models.Model([input_layer], [prev_layer])

#!pip install nengo-dl
import nengo
import nengo_dl

import tensorflow as tf
#print(cnn_model1.summary())

converter = nengo_dl.Converter(
    funcmodel,
    swap_activations={
        tf.keras.activations.relu: nengo.SpikingRectifiedLinear()},
    scale_firing_rates=sfr,
    synapse=0.005,
    inference_only=True)

# Adding timesteps to the dataset:

x_train = x_train.reshape((x_train.shape[0], 1, -1))
y_train = y_train.reshape((y_train.shape[0], -1))

x_test = x_test.reshape((x_test.shape[0], 1, -1))
y_test = y_test.reshape((y_test.shape[0], -1))

do_training = True
if do_training:
    with nengo_dl.Simulator(converter.net, minibatch_size=200) as sim:
        # run training
        sim.compile(
            optimizer=tf.optimizers.Adam(0.001),
            loss=tf.losses.SparseCategoricalCrossentropy(from_logits=True),
            metrics=[tf.metrics.sparse_categorical_accuracy],
        )
        sim.fit(
            {converter.inputs[converter.model.input]: x_train},
            {converter.outputs[converter.model.output]: y_train},
            validation_data=(
                {converter.inputs[converter.model.input]: x_test},
                {converter.outputs[converter.model.output]: y_test},
            ),
            epochs=3,
        )

        # save the parameters to file
        sim.save_params("./keras_to_snn_params")

errors:

ValidationError                           Traceback (most recent call last)
<ipython-input-25-71005d854e2d> in <module>
     17                 {converter.outputs[converter.model.output]: y_test},
     18             ),
---> 19             epochs=1,
     20         )
     21

/opt/conda/lib/python3.7/site-packages/nengo/utils/magic.py in __call__(self, *args, **kwargs)
    179                 return self.wrapper(wrapped, instance, args, kwargs)
    180             else:
--> 181                 return self.wrapper(self.__wrapped__, self.instance, args, kwargs)
    182         else:
    183             instance = getattr(self.__wrapped__, "__self__", None)

/opt/conda/lib/python3.7/site-packages/nengo_dl/simulator.py in require_open(wrapped, instance, args, kwargs)
     65         )
     66
---> 67     return wrapped(*args, **kwargs)
     68
     69

/opt/conda/lib/python3.7/site-packages/nengo_dl/simulator.py in fit(self, x, y, n_steps, stateful, **kwargs)
    858             y_val = validation_data[1]
    859             y_val = self._standardize_data(y_val, self.model.probes)
--> 860             self._check_data(y_val, n_steps=None, nodes=False)
    861
    862             if len(validation_data) == 2:

/opt/conda/lib/python3.7/site-packages/nengo_dl/simulator.py in _check_data(self, data, batch_size, n_steps, nodes)
   1959                     f"should have rank 3 (batch_size, n_steps, dimensions), found rank "
   1960                     f"{len(x.shape)}",
-> 1961                     f"{name} data",
   1962                 )
   1963             if x.shape[0] < self.minibatch_size:

ValidationError: probe data: should have rank 3 (batch_size, n_steps, dimensions), found rank 2

Can anyone help in getting this right?
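For reference, the traceback itself names the expected shape: label data must have rank 3, (batch_size, n_steps, dimensions), while y_train.reshape((y_train.shape[0], -1)) produces rank 2. A minimal shape-only sketch of the reshape the error asks for (dummy arrays with the question's shapes; whether training then converges is untested):

```python
import numpy as np

# dummy labels with the shapes given in the question
y_train = np.zeros(4376)
y_test = np.zeros(570)

# rank 3: (batch_size, n_steps, dimensions) -- here n_steps=1, dimensions=1
y_train = y_train.reshape((y_train.shape[0], 1, 1))
y_test = y_test.reshape((y_test.shape[0], 1, 1))

print(y_train.shape)  # (4376, 1, 1)
print(y_test.shape)   # (570, 1, 1)
```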

SPFx Fast server error on adding new class

Posted: 11 Jul 2021 07:51 AM PDT

I have started using SPFx Fast Serve to enable serving changes quickly in my SPFx web part. However, it throws an error when I add a new class to the .scss file and use it in my React component.

Below is the error message (I have also attached the error screenshot): Property 'class name' does not exist on type...


If I stop npm run serve and then run it again, it works fine. But this slows down my development experience for every class that I add.

When I installed this yesterday, it was working fine when adding new classes, but today it gives me this error. I haven't changed the dev environment since yesterday.

Please let me know if there is anything I can do. Thank you!

Redshift - SQL - ratio of approved students

Posted: 11 Jul 2021 07:51 AM PDT

Hi, I have a table in Redshift with a few student transcript records, as below. I would like to work out the approval ratio for each university (uni1, uni2 and uni3). A student is approved when their grade is either greater than D+ or greater than 50:

(screenshot: sample rows of the student transcript table)

I am now trying to work out the ratio by university using the query below:

select 'uni1' as uni,
count(case when (grade > 50 and grade not in ('D+', 'D', 'D-', 'F')) then 1 end) as passed
,count(*) as total
,(passed / total)*100 as ratio
from analytics.students_fact
where uni = 'uni1' and grade <> 'Unknown'

union

select 'uni2' as uni,
count(case when (grade > 50 and grade not in ('D+', 'D', 'D-', 'F')) then 1 end) as passed
,count(*) as total
,(passed / total)*100 as ratio
from analytics.students_fact
where uni = 'uni2' and grade <> 'Unknown'

union

select 'uni3' as uni,
count(case when (grade > 50 and grade not in ('D+', 'D', 'D-', 'F')) then 1 end) as passed
,count(*) as total
,(passed / total)*100 as ratio
from analytics.students_fact
where uni = 'uni3' and grade <> 'Unknown';

I should also cast all figures in the grade field to integer before doing any comparison, but we have alphabetical characters in there as well. How should I go about casting to integer only those grade values that range from 0-100, while making sure I cover all students?
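One sketch for the casting question (untested against the real table): guard the cast with a POSIX regex so grade::int is only evaluated for purely numeric grades (CASE branches are evaluated in order, so the cast never sees a letter grade), and treat the letter grades by name:

```sql
select uni,
       count(case
               when grade ~ '^[0-9]+$' then
                 case when grade::int > 50 then 1 end
               when grade not in ('D+', 'D', 'D-', 'F') then 1
             end) * 100.0 / count(*) as ratio
from analytics.students_fact
where grade <> 'Unknown'
group by uni;
```

The group by also removes the need for three union'ed queries, and multiplying by 100.0 avoids integer division in the ratio.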

Can we create an installation response file, i.e. an .iss file, for every exe, or are there any constraints on this?

Posted: 11 Jul 2021 07:50 AM PDT

Can we create an installation response file (a setup.iss file) for every setup.exe, or are there any constraints on this?

Setting up and running docker images: basic questions

Posted: 11 Jul 2021 07:50 AM PDT

I'm a bit confused about how to proceed with Docker; sorry if these are simple questions, I'm just getting started. Here is the general way I set things up; I'd appreciate it if you could let me know whether I'm making it more complicated than it needs to be.

I can build an image with the following Dockerfile:

FROM condaforge/mambaforge:4.10.1-0

# Use bash as shell
SHELL ["/bin/bash", "-c"]

# Set working directory
WORKDIR /work_dir

# Install vim
RUN ["apt-get", "update"]
RUN ["apt-get", "install", "-y", "vim"]

# Start Bash shell by default
CMD /bin/bash

I build it with docker build --rm . -t some_docker, but then I'd like to enter the container and install things individually and interactively, so that later I can export the whole image with all the additional installations. So I start it interactively with docker run -it some_docker, after which I do my things. I would then like to export it.

So here are my specific questions:

  • Is there an easier way to build (and keep) the image available so that I can come back to it at another point? When I run docker ps -a I see so many entries that I don't know what they do, since many of them don't have any tag.
  • After building I get the warning Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them. Is this a problem and if so, how to solve it?
  • How can I specify in my Dockerfile (or docker build?) that the ports for RStudio should be open? I saw that docker-compose allows you to specify ports: 8787:8787; how do I do that here?
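On the last bullet: in a plain Dockerfile you can only document the port with EXPOSE; actually publishing it is a runtime decision, made with -p, which is docker run's counterpart of compose's ports: entry:

```dockerfile
# document that the container listens on RStudio's port
EXPOSE 8787
```

and then publish it when starting the container: docker run -it -p 8787:8787 some_docker.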

Are there ways to make my JS code shorter?

Posted: 11 Jul 2021 07:52 AM PDT

Just out of curiosity, I want to know if there is a way to shorten this code. Every idea is accepted.

const prefix = 'b!',
command = me.content.replace(RegExp(prefix + '\\s'), prefix)
 .slice(prefix.length).split(/\s/)[0],
content = me.content.replace(RegExp(prefix + '\\s'), prefix)
 .slice(prefix.length + command.length).trim(),
args = content.split(/\s/);

me.content.replace(RegExp(prefix + '\s'), prefix) is repeated in my code; is there a way to shorten it without creating another variable?

// me.content example:
me.content = 'b! ban @SomeUser for breaking the rules'

Using "IndexOf" on a List<T> of objects

Posted: 11 Jul 2021 07:52 AM PDT

In my Visual Studio 2019 C# console program (named 'codeTester') I have this class:

public class ipData
{
    private string ip;
    private string region;
    private string country;

    public ipData(string ip, string region, string country)
    {
        this.ip = ip;
        this.region = region;
        this.country = country;
    }

    public string Ip
    {
        get { return ip; }
        set { ip = value; }
    }

    public string Region
    {
        get { return region; }
        set { region = value; }
    }

    public string Country
    {
        get { return country; }
        set { country = value; }
    }
}

and I created a List of this type and added some data:

List<ipData> ipInfo = new List<ipData>();
ipInfo.Add(new ipData("192.168.0.199", "UT", "USA"));
ipInfo.Add(new ipData("251.168.0.963", "NB", "CAN"));

Now I want to search the list on one of its fields so I ask the user for the data to search for:

Console.WriteLine("Enter search criteria: ");
string searchparam = Console.ReadLine();

Next I want the index of the found item, if any:

int x = ipInfo.IndexOf(searchparam);  

but this statement produces a compile-time error which says:

"Argument 1: cannot convert from 'string' to 'codeTester.Program.ipData'"

So I've been stuck at this point for hours and all my searches have not yielded anything pertinent. Where am I going wrong?

Rock Paper Scissors button issue

Posted: 11 Jul 2021 07:51 AM PDT

I built a very basic rock paper scissors game, but it seems that somehow the function stores "rock" in the userSelection variable no matter which button is clicked.

I tried removing the = true part entirely; that didn't work. I tried changing it to == or === instead of =; that didn't help either.

I've spent hours researching various aspects of my code but can't figure it out.

https://codepen.io/anna_whiskey/pen/YzVGBYN

const gameInputs = ['rock', 'paper', 'scissors'];
let computerSel;
let round1Answer;
let userSelection;

function game() {

  function humanPlay() {
    if (document.getElementById("btn1").clicked = true) {
      userSelection = 'rock';
      console.log(userSelection);
    }
    else if (document.getElementById("btn2").clicked = true) {
      userSelection = 'paper';
      console.log(userSelection);
    }
    else if (document.getElementById("btn3").clicked = true) {
      userSelection = 'scissors';
      console.log(userSelection);
    }
  }
  humanPlay();

  function computerPlay() {
    computerSel = Math.floor(Math.random() * gameInputs.length);
    round1Answer = (gameInputs[computerSel]);
  }
  computerPlay();

  function playRound(round1Answer, userSelection) {
    if (userSelection === 'rock' && round1Answer === 'scissors') {
      alert('You WIN!');
    } else if (userSelection === 'rock' && round1Answer === 'rock') {
      alert('It/s a tie!');
    } else if (userSelection === 'paper' && round1Answer === 'rock') {
      alert('You WIN!');
    } else if (userSelection === 'paper' && round1Answer === 'paper') {
      alert('It/s a tie!');
    } else if (userSelection === 'scissors' && round1Answer === 'paper') {
      alert('You WIN!');
    } else if (userSelection === 'scissors' && round1Answer === 'scissors') {
      alert('It/s a tie!');
    } else {
      alert('You LOSE!');
    }
  }
  playRound(round1Answer, userSelection);
}

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Rock Paper Scissors</title>
<body>
<div class="bg"></div>

<button id="btn1" onclick="game()">Rock</button>
<button id="btn2" onclick="game()">Paper</button>
<button id="btn3" onclick="game()">Scissors</button>

<link rel="stylesheet" href="rps.css">
<script type="text/javascript" src="rps.js"></script>

</body>
</html>

What is required to deploy a Next.js app without Vercel?

Posted: 11 Jul 2021 07:50 AM PDT

I need to deploy a Next.js app, but not on Vercel. What steps should I take to deploy it? I used to drag and drop the build folder from a React app onto the server, but with Next.js I don't know which files I should deploy, or how.


Note: I am using the getServerSideProps function, so I think next export will not work with it.
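For what it's worth: since getServerSideProps needs a Node.js runtime, next export is indeed not an option. The usual self-hosted route is to put the whole project on a Node server and run Next's own server after building (this assumes the default package.json scripts, where start maps to next start):

```shell
npm install
npm run build   # produces the .next folder
npm run start   # runs "next start", listening on port 3000 by default
```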

Number of columns not matching

Posted: 11 Jul 2021 07:52 AM PDT

I'm trying to use the rbind function to create some data for a matching process, but I'm getting this error:

Error in rbind(deparse.level, ...) :
  numbers of columns of arguments do not match

I've checked and adjusted the order so the columns definitely match, but I'm still getting the error. Any idea why?

This is my code:

# Match on the data from the year before treatment, matching on counttotal and countbrown
matchData <-
  rbind(treat_firms_1year_prior[, -c(
    grep("year_int_tx", colnames(treat_firms_1year_prior)),
    grep("matchingyear", colnames(treat_firms_1year_prior)),
    grep("flag", colnames(treat_firms_1year_prior))
  )],
  control_firms_year_int_tx)

These are the column names:

> colnames(treat_firms_1year_prior)
 [1] "investor"            "dealyear"            "totalUSD"            "counttotal"          "greenUSD"            "countgreen"
 [7] "brownUSD"            "countbrown"          "signatory"           "treatment"           "firsttreat"          "matchingyear"
[13] "country"             "region"              "yearest"             "strategy"            "capsources"          "historicfunds"
[19] "eligible_treat_firm" "year_int_tx"         "flag"

> colnames(control_firms_year_int_tx)
 [1] "investor"            "dealyear"            "totalUSD"            "counttotal"          "greenUSD"            "countgreen"
 [7] "brownUSD"            "countbrown"          "signatory"           "treatment"           "firsttreat"          "matchingyear"
[13] "country"             "region"              "yearest"             "strategy"            "capsources"          "historicfunds"
[19] "eligible_treat_firm" "year_int_tx"         "flag"

Any tips would be great!

How can I extract features from audio files into a dataset?

Posted: 11 Jul 2021 07:52 AM PDT

I have a folder on my desktop containing 187 audio files in WAV format. I want to extract features from the audio files, so I executed the following code to extract them and save the features to a CSV file. However, I obtained a CSV file with no values; it contains only the title of each column, and the output of len(audio_files) was 0, but it should be 187. How can I solve this problem?

from glob import glob

data_dir = './audio featur-extraction\audio-setA/'
audio_files = glob(data_dir + '*.wav')
len(audio_files)

import librosa  # needed below for librosa.load
from librosa import feature
import numpy as np

fn_list_i = [
    feature.chroma_stft,
    feature.spectral_centroid,
    feature.spectral_bandwidth,
    feature.spectral_rolloff,
]

fn_list_ii = [
    feature.zero_crossing_rate
]

def get_feature_vector(y, sr):
    feat_vect_i = [np.mean(funct(y, sr)) for funct in fn_list_i]
    feat_vect_ii = [np.mean(funct(y)) for funct in fn_list_ii]
    feature_vector = feat_vect_i + feat_vect_ii
    return feature_vector

# build the matrix with normal audios featurized
audios_feat = []
for file in audio_files:
    '''
    y is the time series array of the audio file, a 1D np.ndarray
    sr is the sampling rate, a number
    '''
    y, sr = librosa.load(file, sr=None)
    feature_vector = get_feature_vector(y, sr)
    audios_feat.append(feature_vector)
    print('.', end=" ")

print(audios_feat)

# .........................
import csv

norm_output = 'normals_00.csv'

header = [
    'chroma_stft',
    'spectral_centroid',
    'spectral_bandwidth',
    'spectral_rolloff',
    'zero_crossing_rate',
]

# WARNING: this overwrites the file each time. Be aware of this because the feature extraction step takes time.
with open(norm_output, '+w') as f:
    csv_writer = csv.writer(f, delimiter=',')
    csv_writer.writerow(header)
    csv_writer.writerows(audios_feat)
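One thing worth checking for the len(audio_files) == 0 problem: the pattern above mixes / and \ separators, and on a case-sensitive filesystem '*.wav' will not match files named '*.WAV'. A sketch using os.path.join (the folder names are copied from the question and assumed):

```python
import os
from glob import glob

# folder names taken from the question; adjust to the real location
data_dir = os.path.join('.', 'audio featur-extraction', 'audio-setA')
audio_files = glob(os.path.join(data_dir, '*.wav'))
print(len(audio_files))  # should print 187 once the path resolves
```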

Array of Objects in C++

Posted: 11 Jul 2021 07:52 AM PDT

I am trying to write chess with OpenGL and C++. So far I have created the Pawn object, whose constructor takes an unsigned int parameter. I tried to create an array of these pawns, and the only working way I've found to do this is:

Pawn *pawn[n];

for (int i = 0; i < n; i++) {
    pawn[i] = new Pawn(var);  // var is the unsigned int argument
}

To call a function of pawn[0], for example, I have to do this:

pawn[0]->function(parameters);  

This is the Pawn class :

class Pawn
{
private:

    float vertices[16] = {
        // position       // text coord
        -0.08f, -0.10f,   0.0f, 0.0f,
         0.08f, -0.10f,   1.0f, 0.0f,
         0.08f,  0.10f,   1.0f, 1.0f,
        -0.08f,  0.10f,   0.0f, 1.0f
    };

    GLuint indices[6] {
        0, 1, 2,
        0, 2, 3
    };

    unsigned int shaderID, VBO, VAO, EBO, texture;

public:

    glm::vec2 Position = glm::vec2(0.0f, 0.0f);

    Pawn() {}

    Pawn(GLuint shaderID) {
        ...
    }

    ~Pawn() {
        ...
    }

    void setTexture();
    void draw(glm::vec2 position);
};

I also tried this :

Pawn pawn[8];

for (int i = 0; i < 8; i++) {
    pawn[i] = Pawn(shaderID);
}

but when I run it, it doesn't work.

I was wondering whether the first method is efficient, and if so, why it works, since I didn't understand it. Thanks for your help.

word counter with regex in powershell

Posted: 11 Jul 2021 07:52 AM PDT

var words = "word worddd woord wooord 45555";

var wordCount = words.match(/([a-zA-Z]\w+)/g).length;

if (wordCount == 4 || wordCount == 6) {
    WScript.Echo(wordCount); // Result: 4
}

How can I make a .ps1 script that works like this JScript?
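For reference, the counting logic in the JScript snippet (match tokens that start with a letter, then take the number of matches) can be sketched like this in Python; this is just an illustration of the regex, not the requested PowerShell:

```python
import re

words = "word worddd woord wooord 45555"
# Tokens beginning with a letter; "45555" starts with a digit and is skipped.
word_count = len(re.findall(r"[a-zA-Z]\w+", words))
if word_count in (4, 6):
    print(word_count)  # prints 4
```

In PowerShell the same pieces exist through .NET: [regex]::Matches($words, '[a-zA-Z]\w+') returns a collection whose Count property is the word count.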

How can I determine the complexity of this small program? [duplicate]

Posted: 11 Jul 2021 07:52 AM PDT

package maguen_3;

import java.util.Arrays;

public class Maguen_3_test {

    public static void main(String[] args) {
        randomPLaylist(10);
    }

    public static void randomPLaylist(int num) {
        int[] songsArray = new int[num];
        for (int i = 0; i < num; i++) {
            songsArray[i] = (int) (Math.random() * num) + 1;
            for (int j = 0; j < i; j++) {
                if (songsArray[i] == songsArray[j]) {
                    i--;
                    break;
                }
            }
        }
        System.out.println(Arrays.toString(songsArray));
    }
}

This is a small program that shuffles a playlist. The function randomPLaylist receives the length of the playlist. I know intuitively that it is O(n * log n), but I don't know how to explain it in words.
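One way to build the intuition is to separate the two costs: drawing until every song appears is the coupon-collector process, which needs about n·ln n draws in expectation, and each draw additionally pays for a duplicate scan. A Python transliteration of the same rejection-sampling idea (names are mine, not from the question):

```python
import random

def random_playlist(num):
    """Collect distinct songs 1..num by redrawing on every collision."""
    songs = []
    while len(songs) < num:
        candidate = random.randint(1, num)
        # duplicate scan: O(len(songs)) work per draw, like the inner j-loop
        if candidate not in songs:
            songs.append(candidate)
    return songs

print(sorted(random_playlist(10)))  # always [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

Counting only the draws gives the n log n figure; counting the duplicate scan as well gives a larger expected bound, which is worth making explicit when writing the argument down.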

Writing tests for Flux.handle().onErrorContinue()

Posted: 11 Jul 2021 07:51 AM PDT

I am trying to write tests for a Flux that uses onErrorContinue after a handle(), but I haven't found any documentation on how to do this. So far I've used StepVerifier for the other testing aspects; now I want to generate an exception and test that it is handled properly. Here is a sample of the code:

public void handleEvent(Flux<Message<String>> messageFlux) {
    messageFlux
        .filter(cloudEventValidator::isValidCloudEvent)
        .handle((stringMessage, sink) -> {
            List<Endpoint> endpoints = getEndpoints(stringMessage);
            endpoints.forEach(endpoint -> deliveryToEndpoint(endpoint, stringMessage));
        })
        .onErrorContinue((e, object) -> { log.error(e.getClass().toString() + " " + e.getMessage()); })
        .subscribe();
}

Any help would be appreciated.

How to create a dynamic menu in WordPress from existing HTML code? The dynamic menu is created, but it is not properly aligned

Posted: 11 Jul 2021 07:52 AM PDT

<ul class="nav-menu">
    <li class="nav-item"><a href="inner.html">Home</a></li>
    <li class="nav-item"><a href="inner.html">About Us</a>
        <div class="sub-nav">
            <ul class="sub-nav-group">
                <li><a href="inner.html">Profile</a></li>
                <li><a href="inner.html">Mission &#47; Vision</a></li>
            </ul>
        </div>
    </li>
    <li class="nav-item"><a href="inner.html">Notices</a>
        <div class="sub-nav">
            <ul class="sub-nav-group">
                <li><a href="inner.html">Announcements</a></li>
                <li><a href="inner.html">Recruitments</a></li>
            </ul>
        </div>
    </li>
</ul>

Above is my HTML code: a menu with a submenu. How do I align the dynamic menu?

Firestore: Can we not use orderBy on a field filtered with whereIn?

Posted: 11 Jul 2021 07:50 AM PDT

I get this error when trying to orderBy a field I have filtered with whereIn(field, Array):

FirebaseFirestoreException: INVALID_ARGUMENT: Order by clause cannot contain a field with an equality filter

But there is no equality sign; there is rather an OR between equalities, like field="xx" OR field="yy" OR field="zz".

That makes it perfectly reasonable to want to order by the field.

Is it really not possible to use orderBy with whereIn on the same field?

Execution failed for task ':app:transformClassesAndResourcesWithR8ForRelease' while generating signed apk/bundle

Posted: 11 Jul 2021 07:52 AM PDT

The program compiles fine and I can generate an unsigned APK, but I get "Execution failed for task ':app:transformClassesAndResourcesWithR8ForRelease'" while generating a signed APK.

I have already set multiDexEnabled = true and android.enableR8 = true. Even android.enableR8 = false did not work.

org.gradle.execution.MultipleBuildFailures: Build completed with 1 failures.      at org.gradle.initialization.DefaultGradleLauncher$ExecuteTasks.run(DefaultGradleLauncher.java:386)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:301)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:293)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:175)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:91)      at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)      at org.gradle.initialization.DefaultGradleLauncher.runTasks(DefaultGradleLauncher.java:247)      at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:159)      at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:134)      at org.gradle.internal.invocation.GradleBuildController$1.execute(GradleBuildController.java:58)      at org.gradle.internal.invocation.GradleBuildController$1.execute(GradleBuildController.java:55)      at org.gradle.internal.invocation.GradleBuildController$3.create(GradleBuildController.java:82)      at org.gradle.internal.invocation.GradleBuildController$3.create(GradleBuildController.java:75)      at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:183)      at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:40)      at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:75)      at org.gradle.internal.invocation.GradleBuildController.run(GradleBuildController.java:55)      at 
org.gradle.tooling.internal.provider.runner.BuildModelActionRunner.run(BuildModelActionRunner.java:54)      at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)      at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)      at org.gradle.launcher.exec.BuildOutcomeReportingBuildActionRunner.run(BuildOutcomeReportingBuildActionRunner.java:58)      at org.gradle.tooling.internal.provider.ValidatingBuildActionRunner.run(ValidatingBuildActionRunner.java:32)      at org.gradle.launcher.exec.BuildCompletionNotifyingBuildActionRunner.run(BuildCompletionNotifyingBuildActionRunner.java:39)      at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:49)      at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:44)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:315)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:305)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:175)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:101)      at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)      at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner.run(RunAsBuildOperationBuildActionRunner.java:44)      at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:49)      at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:46)      at org.gradle.composite.internal.DefaultRootBuildState.run(DefaultRootBuildState.java:78)      
at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:46)      at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:31)      at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:42)      at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:28)      at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:78)      at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:52)      at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:59)      at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:36)      at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:68)      at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:38)      at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:37)      at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:26)      at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:43)      at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:29)      at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:60)      at 
org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:32)      at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:55)      at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:41)      at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:48)      at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:32)      at org.gradle.launcher.daemon.server.exec.ExecuteBuild.doBuild(ExecuteBuild.java:67)      at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:36)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.WatchForDisconnection.execute(WatchForDisconnection.java:37)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.ResetDeprecationLogger.execute(ResetDeprecationLogger.java:26)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.RequestStopIfSingleUsedDaemon.execute(RequestStopIfSingleUsedDaemon.java:34)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.call(ForwardClientInput.java:74)      at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.call(ForwardClientInput.java:72)      at org.gradle.util.Swapper.swap(Swapper.java:38)      at org.gradle.launcher.daemon.server.exec.ForwardClientInput.execute(ForwardClientInput.java:72)      at 
org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.LogAndCheckHealth.execute(LogAndCheckHealth.java:55)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.LogToClient.doBuild(LogToClient.java:62)      at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:36)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment.doBuild(EstablishBuildEnvironment.java:81)      at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:36)      at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)      at org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy$1.run(StartBuildOrRespondWithBusy.java:50)      at org.gradle.launcher.daemon.server.DaemonStateCoordinator$1.run(DaemonStateCoordinator.java:295)      at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)      at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)      at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)      at java.lang.Thread.run(Thread.java:745)  Caused by: org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':app:transformClassesAndResourcesWithR8ForRelease'.      
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:95)      at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)      at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:57)      at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:119)      at org.gradle.api.internal.tasks.execution.ResolvePreviousStateExecuter.execute(ResolvePreviousStateExecuter.java:43)      at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)      at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)      at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:94)      at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:56)      at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:55)      at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)      at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:67)      at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)      at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:315)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:305)   
   at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:175)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:101)      at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)      at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:49)      at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)      at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)      at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)      at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)      at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)      at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)      at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)      at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)      at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)      at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)      ... 
6 more  Caused by: java.lang.NullPointerException      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)      at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:598)      at java.util.concurrent.ForkJoinTask.get(ForkJoinTask.java:1005)      at com.android.tools.r8.utils.ThreadUtils.awaitFutures(ThreadUtils.java:21)      at com.android.tools.r8.ir.conversion.CallGraph.forEachMethod(CallGraph.java:472)      at com.android.tools.r8.ir.conversion.IRConverter.optimize(IRConverter.java:549)      at com.android.tools.r8.R8.run(R8.java:508)      at com.android.tools.r8.R8.run(R8.java:251)      at com.android.tools.r8.R8.lambda$runForTesting$1(R8.java:242)      at com.android.tools.r8.utils.ExceptionUtils.withCompilationHandler(ExceptionUtils.java:62)      at com.android.tools.r8.utils.ExceptionUtils.withR8CompilationHandler(ExceptionUtils.java:50)      at com.android.tools.r8.R8.runForTesting(R8.java:238)      at com.android.tools.r8.R8.run(R8.java:148)      at com.android.builder.dexing.R8Tool.runR8(r8Tool.kt:189)      at com.android.build.gradle.internal.transforms.R8Transform.transform(R8Transform.kt:260)      at com.android.build.gradle.internal.pipeline.TransformTask$2.call(TransformTask.java:239)      at com.android.build.gradle.internal.pipeline.TransformTask$2.call(TransformTask.java:235)      at com.android.builder.profile.ThreadRecorder.record(ThreadRecorder.java:102)      at com.android.build.gradle.internal.pipeline.TransformTask.transform(TransformTask.java:230)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)      at org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:47)      at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)      at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)      at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.run(ExecuteActionsTaskExecuter.java:284)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:301)      at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:293)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:175)      at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:91)      at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)      at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:273)      at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:258)      at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:67)      at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:145)      at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:49)      at 
org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)      at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)      at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)      at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:33)      at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:50)      at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:43)      at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:29)      at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:134)      at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$3(CacheStep.java:83)      at java.util.Optional.orElseGet(Optional.java:267)      at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)      at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:36)      at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)      at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)      at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)      at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:96)      at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:89)      at java.util.Optional.map(Optional.java:215)      at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:52)      at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)      at 
org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)      at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:91)      ... 35 more      Suppressed: java.util.concurrent.ExecutionException: java.lang.NullPointerException          at java.util.concurrent.ForkJoinTask.get(ForkJoinTask.java:1006)          ... 92 more      [CIRCULAR REFERENCE:java.lang.NullPointerException]  Caused by: java.lang.NullPointerException      at com.android.tools.r8.ir.optimize.CodeRewriter.workaroundExceptionTargetingLoopHeaderBug(CodeRewriter.java:3839)      at com.android.tools.r8.ir.conversion.IRConverter.performRegisterAllocation(IRConverter.java:1251)      at com.android.tools.r8.ir.conversion.IRConverter.finalizeToDex(IRConverter.java:1207)      at com.android.tools.r8.ir.conversion.IRConverter.finalizeIR(IRConverter.java:1181)      at com.android.tools.r8.ir.conversion.IRConverter.rewriteCode(IRConverter.java:1122)      at com.android.tools.r8.ir.conversion.IRConverter.processMethod(IRConverter.java:805)      at com.android.tools.r8.ir.conversion.IRConverter.lambda$optimize$5(IRConverter.java:551)      at com.android.tools.r8.ir.conversion.CallGraph.lambda$forEachMethod$6(CallGraph.java:468)      at java.util.concurrent.ForkJoinTask$AdaptedCallable.exec(ForkJoinTask.java:1424)      at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)      at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)      at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)      at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)      

Here is my build.gradle (Module: app)

apply plugin: 'com.android.application'
apply plugin: 'com.google.gms.google-services'

android {
    compileSdkVersion 28
    buildToolsVersion '28.0.3'
    defaultConfig {
        applicationId "com.microsoft.abiram.gkm"
        minSdkVersion 18
        targetSdkVersion 28
        versionCode 26
        versionName "1.7"
        multiDexEnabled true
        vectorDrawables.useSupportLibrary = true
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }
    buildTypes {
        release {
            minifyEnabled true
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(include: ['*.jar'], dir: 'libs')
    implementation 'androidx.appcompat:appcompat:1.0.2'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test:runner:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
    implementation 'com.google.firebase:firebase-core:16.0.8'
    implementation 'io.branch.sdk.android:library:3.1.2'
    implementation 'com.google.firebase:firebase-auth:16.2.1'
    implementation 'com.google.android.gms:play-services-auth:16.0.1'
    implementation 'pl.droidsonroids.gif:android-gif-drawable:1.2.16'
    implementation 'com.google.firebase:firebase-config:16.5.0'
    implementation 'com.google.firebase:firebase-messaging:17.6.0'
    implementation 'com.google.android.material:material:1.1.0-alpha05'
    implementation 'com.github.paolorotolo:expandableheightlistview:1.0.0'
    implementation 'com.github.GrenderG:Toasty:1.4.1'
    implementation 'com.getkeepsafe.taptargetview:taptargetview:1.12.0'
    implementation 'com.github.AppIntro:AppIntro:v5.1.0'
    implementation 'com.tapadoo.android:alerter:4.0.2'
    implementation 'androidx.cardview:cardview:1.0.0'
    implementation 'com.gdacciaro:iosdialog:1.0.3'
}

build.gradle (Project)

// Top-level build file where you can add configuration options common to all sub-projects/modules.

buildscript {
    repositories {
        google()
        jcenter()
        mavenCentral()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.4.0'

        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
        classpath 'com.google.gms:google-services:4.2.0'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        mavenCentral()
        maven { url "https://jitpack.io" }
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}

subprojects {
    project.configurations.all {
        resolutionStrategy.eachDependency { details ->
            if (details.requested.group == 'com.android.support'
                    && !details.requested.name.contains('multidex')) {
                details.useVersion "26.1.0"
            }
        }
    }
}

gradle.properties

# Project-wide Gradle settings.
# IDE (e.g. Android Studio) users:
# Gradle settings configured through the IDE *will override*
# any settings specified in this file.
# For more details on how to configure your build environment visit
# http://www.gradle.org/docs/current/userguide/build_environment.html

# Specifies the JVM arguments used for the daemon process.
# The setting is particularly useful for tweaking memory settings.
org.gradle.jvmargs=-Xmx1536m

android.useAndroidX=true
android.enableJetifier=true
android.enableR8 = true
multiDexEnabled = true

# When configured, Gradle will run in incubating parallel mode.
# This option should only be used with decoupled projects. More details, visit
# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
# org.gradle.parallel=true

Calculating an elapsed Time

Posted: 11 Jul 2021 07:51 AM PDT

I want to calculate the elapsed time a process needs to execute, based on two strings with timestamps in the format HH:mm:ss:ff. Therefore I split those strings, converted the parts to integers, and subtracted them.

What I tried is to subtract the first timestamp from the last. It works sometimes, but I also get a lot of weird results, for example: 0:0:-3:-18. I think this happens because I don't handle the case where a field of the start time is larger than the corresponding field of the end time when they are subtracted.

Here is the function I use to subtract the strings:

static string calculateElapsedTime(string startTime, string endTime)
{
    try
    {
        string[] startTimeSplit = startTime.Split(new char[] { ':', '.' });
        string[] endTimeSplit = endTime.Split(new char[] { ':', '.' });
        int[] elapsedTime = new int[4];

        elapsedTime[0] = Convert.ToInt32(endTimeSplit[0]) - Convert.ToInt32(startTimeSplit[0]);
        elapsedTime[1] = Convert.ToInt32(endTimeSplit[1]) - Convert.ToInt32(startTimeSplit[1]);
        elapsedTime[2] = Convert.ToInt32(endTimeSplit[2]) - Convert.ToInt32(startTimeSplit[2]);
        elapsedTime[3] = Convert.ToInt32(endTimeSplit[3]) - Convert.ToInt32(startTimeSplit[3]);

        string elapsedTimeString = string.Format("{0}:{1}:{2}:{3}", elapsedTime[0], elapsedTime[1], elapsedTime[2], elapsedTime[3]);

        return elapsedTimeString;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        return "null";
    }
}

And I got the values for the parameters by simply getting the current time like this:

DateTime.Now.ToString("HH:mm:ss:ff", System.Globalization.DateTimeFormatInfo.InvariantInfo);  
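The negative fields go away if the two stamps are parsed whole and subtracted as date/time values, letting the library handle the borrowing between fields. A sketch in Python for illustration, where %H:%M:%S:%f maps onto the HH:mm:ss:ff strings (the trailing digits are read as a fraction of a second); in C# the analogous move is DateTime.ParseExact plus subtraction, which yields a TimeSpan:

```python
from datetime import datetime

def elapsed(start, end):
    # Parse the full "HH:mm:ss:ff" stamps and subtract them as datetimes,
    # so borrows between hours/minutes/seconds are handled for us.
    fmt = "%H:%M:%S:%f"
    return datetime.strptime(end, fmt) - datetime.strptime(start, fmt)

print(elapsed("10:59:58:75", "11:00:01:25"))  # 0:00:02.500000
```

Subtracting field by field, as in the function above, reimplements this borrowing logic by hand, which is exactly where the negative values come from.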

SOLUTION:

There is a class called Stopwatch in the namespace System.Diagnostics.

You can use it as follows:

Stopwatch watch = new Stopwatch();
watch.Start();
// process to be measured
watch.Stop();
Console.WriteLine(watch.Elapsed);

Understanding Knuth-Morris-Pratt Algorithm

Posted: 11 Jul 2021 07:51 AM PDT

Can someone explain this to me? I've been reading about it and it still is hard to follow.

text:    ababdbaababa
pattern: ababa

The table for ababa is -1 0 0 1 2.

I think I understand how the table is constructed, but I don't understand how to shift once a mismatch has occurred. It seems like we don't even use the table when shifting?

When do we use the table?
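The table is consulted exactly at the moment of a mismatch: after k pattern characters have matched, a failing comparison sets k to table[k], which says how many of those characters are still usable, so the text pointer never moves backwards. A sketch in Python using the question's convention (table[0] = -1); the one extra entry at the end, table[m], is my addition so the search can continue after a full match:

```python
def build_table(pattern):
    """table[i] = length of the longest proper prefix of pattern[:i] that is
    also a suffix of it, with table[0] = -1 by convention.  table[m] is one
    extra entry used to keep searching after a complete match."""
    m = len(pattern)
    table = [-1] + [0] * m
    k = 0  # border length of the prefix processed so far
    for i in range(2, m + 1):
        while k > 0 and pattern[k] != pattern[i - 1]:
            k = table[k]
        if pattern[k] == pattern[i - 1]:
            k += 1
        table[i] = k
    return table

def kmp_search(text, pattern):
    table = build_table(pattern)
    m = len(pattern)
    matches = []
    k = 0  # number of pattern characters matched so far
    for i, ch in enumerate(text):
        while k >= 0 and pattern[k] != ch:
            k = table[k]  # the shift: keep table[k] matched chars, don't restart at 0
        k += 1            # if k fell to -1 (table[0]), this just advances the text
        if k == m:
            matches.append(i - m + 1)
            k = table[m]  # continue; overlapping matches are still found
    return matches

print(build_table("ababa")[:5])             # [-1, 0, 0, 1, 2] -- as in the question
print(kmp_search("ababdbaababa", "ababa"))  # [7]
```

In this example the mismatch at text position 4 ('d' against the final 'a') sends k from 4 to table[4] = 2, then to 0, then to -1, and only then does the scan advance to the next text character.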
