Friday, June 10, 2022

Recent Questions - Stack Overflow

Azure App Services Reloading site instead of css files

Posted: 10 Jun 2022 07:49 AM PDT

OK, odd one. I have deployed an example Angular application to Azure App Services. This has worked, but the styles are failing to load. On checking:

https://angularoptimized.azurewebsites.net/styles.7ade3819d0791222.css  

This fails to load: it is redirecting to the main site rather than failing outright. I can confirm the file exists. In case it matters, there are also compressed versions available on the server, so I'm wondering if this is being interpreted weirdly. If checked in a browser, it's not identifying the call as a style sheet either. Any ideas?
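
One common culprit in this situation is a catch-all SPA rewrite that sends every request, including static assets, back to index.html. A minimal web.config sketch under that assumption (not confirmed by the post; IIS-hosted App Service) excludes files that actually exist on disk from the rewrite:

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="AngularSpaFallback" stopProcessing="true">
          <match url=".*" />
          <conditions logicalGrouping="MatchAll">
            <!-- only rewrite when the request is NOT a real file or directory -->
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
          </conditions>
          <action type="Rewrite" url="/index.html" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>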

How to add the current iteration's value to the previous iteration's value in a loop C#

Posted: 10 Jun 2022 07:49 AM PDT

I can find a Java version of my question, but not a C# one. My current attempt goes crazy when attempting to add them. I feel like there is a simple fix, but I'm struggling to think of it.

// See https://aka.ms/new-console-template for more information
int[] input = { 28, 2, 3, -3, -2, 1, 2, 35, -1, 0, 0, -1 };

for (int i = 0; i < input.Length; i++)
{
    int x = input[i];
    int y = input[i++];
    int output = x + y;
    Console.WriteLine(output);
}
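
A minimal sketch of one likely fix (an assumption about intent, not from the original post): input[i++] re-reads the current element and silently bumps the loop counter, skipping elements. Indexing backwards with i - 1 pairs each value with the previous one without touching the counter:

int[] input = { 28, 2, 3, -3, -2, 1, 2, 35, -1, 0, 0, -1 };

// start at 1 so that input[i - 1] is always a valid index
for (int i = 1; i < input.Length; i++)
{
    int output = input[i] + input[i - 1]; // current value plus the previous one
    Console.WriteLine(output);
}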

Can I put a box half over another box (without px)?

Posted: 10 Jun 2022 07:49 AM PDT

Is there an optimal solution for my problem? I have a green box with position: relative; and a white box with position: absolute;

The absolute box has top: 0; I just want the white box to be exactly centered (with one half over the green box and the other half outside). Is there a solution for this in CSS, regardless of the height of the box in px?

Thank you in advance!

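A minimal CSS sketch (an assumption, not from the original post): translateY(-50%) moves the white box up by half of its own rendered height, whatever that height turns out to be, so no px value is needed:

.green {
  position: relative;
}

.white {
  position: absolute;
  top: 0;
  /* percentages inside translate refer to the element's own size */
  transform: translateY(-50%);
}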

Blazor does not show Not Found page for file-like endpoints

Posted: 10 Jun 2022 07:48 AM PDT

For some reason, Blazor does not show the "Not Found" page created in App.razor when users type a file-like endpoint into their browser (example: localhost:5001/unexist.file).

Example when the endpoint contains dots: [screenshot]

App.razor

<CascadingAuthenticationState>
  <Router AppAssembly="@typeof(Program).Assembly">
    ....
    <NotFound>
      <LayoutView Layout="@typeof(MainLayout)">
        ...
        My custom Not Found page content
        ...
      </LayoutView>
    </NotFound>
  </Router>
  <NavigationTracker />
</CascadingAuthenticationState>

This works correctly only if the endpoint doesn't contain any dot characters. Is there a way to control this?
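
A sketch of one likely cause, assuming a Blazor Server app with the usual _Host fallback page (an assumption; the post doesn't say): the default fallback route uses the {*path:nonfile} pattern, which deliberately skips URLs that look like file names, so dotted paths never reach the Blazor router. An explicit pattern without the :nonfile constraint routes them too:

// in Program.cs / Startup.cs endpoint configuration
app.MapFallbackToPage("/{*path}", "/_Host");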

Converting an entity class to a record: how can I do it the right way?

Posted: 10 Jun 2022 07:48 AM PDT

I want to convert this class to record

public partial class PathCategory
{
    public PathCategory()
    {
        DocumentPaths = new HashSet<DocumentPath>();
    }

    public Guid Id { get; set; }
    public string Category { get; set; } = null!;

    public virtual ICollection<DocumentPath> DocumentPaths { get; set; }
}

so all I do is replace partial class with record, like this:

public record PathCategory  

Is that it, or are there any best practices I should follow?
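
A minimal sketch of what the converted type might look like (an assumption, not from the original post): the record keyword alone mainly adds value-based equality and ToString formatting; switching setters to init is the usual extra step if immutability is the goal, though note that value-based equality on entities can surprise EF Core change tracking:

public partial record PathCategory
{
    public PathCategory()
    {
        DocumentPaths = new HashSet<DocumentPath>();
    }

    public Guid Id { get; init; }
    public string Category { get; init; } = null!;

    // left mutable so EF Core can populate the navigation collection
    public virtual ICollection<DocumentPath> DocumentPaths { get; set; }
}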

ERROR: Template does not exist / Used: Django + React + heroku

Posted: 10 Jun 2022 07:48 AM PDT

I created a simple note app with Django & React, and I am trying to deploy it to Heroku.

I tried heroku open in the terminal, but this error page came out: [screenshot]

Here is my environment and the things I've tried for two days.


File structure)

project-folder
    api
    env
    frontend          (my React app)
        build
            index.html
        public
        src
        package.json
    mynote            (my Django main app)
        settings.py
    manage.py
    Procfile
    requirements.txt
    runtime.txt

requirements.txt)

asgiref==3.5.2
dj-database-url==0.5.0
Django==4.0.5
django-cors-headers==3.13.0
djangorestframework==3.13.1
gunicorn==20.1.0
psycopg2-binary==2.9.3
pytz==2022.1
sqlparse==0.4.2
tzdata==2022.1
whitenoise==6.2.0

runtime.txt)

python-3.10.0  

Procfile)

web: gunicorn mynotes.wsgi --log-file -  

settings.py)

"""  Django settings for mynotes project.  """    from pathlib import Path  import os      BASE_DIR = Path(__file__).resolve().parent.parent  #BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))      # for heroku deploy  SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY', '~')    # for heroku deploy  DEBUG = bool( os.environ.get('DJANGO_DEBUG', True) )    ALLOWED_HOSTS = [      '[here is my project name!].herokuapp.com',      '127.0.0.1'  ]      # Application definition    INSTALLED_APPS = [      'django.contrib.admin',      'django.contrib.auth',      'django.contrib.contenttypes',      'django.contrib.sessions',      'django.contrib.messages',      'django.contrib.staticfiles',            'rest_framework',      'corsheaders',      'api',  ]    MIDDLEWARE = [      'whitenoise.middleware.WhiteNoiseMiddleware',      'django.middleware.security.SecurityMiddleware',      'corsheaders.middleware.CorsMiddleware',      'django.contrib.sessions.middleware.SessionMiddleware',      'django.middleware.common.CommonMiddleware',      'django.middleware.csrf.CsrfViewMiddleware',      'django.contrib.auth.middleware.AuthenticationMiddleware',      'django.contrib.messages.middleware.MessageMiddleware',      'django.middleware.clickjacking.XFrameOptionsMiddleware',  ]    CORS_ALLOW_ALL_ORIGINS = True    ROOT_URLCONF = 'mynotes.urls'    TEMPLATES = [      {          'BACKEND': 'django.template.backends.django.DjangoTemplates',          'DIRS': [              os.path.join(BASE_DIR, 'frontend/build')          ],          'APP_DIRS': True,          'OPTIONS': {              'context_processors': [                  'django.template.context_processors.debug',                  'django.template.context_processors.request',                  'django.contrib.auth.context_processors.auth',                  'django.contrib.messages.context_processors.messages',              ],          },      },  ]    WSGI_APPLICATION = 'mynotes.wsgi.application'      # Database  # https://docs.djangoproject.com/en/4.0/ref/settings/#databases    DATABASES = {      'default': {          'ENGINE': 'django.db.backends.sqlite3',          'NAME': BASE_DIR / 'db.sqlite3',      }  }      # Password validation  # https://docs.djangoproject.com/en/4.0/ref/settings/#auth-password-validators    AUTH_PASSWORD_VALIDATORS = [      {          'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',      },      {          'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',      },      {          'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',      },      {          'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',      },  ]      # Internationalization  # https://docs.djangoproject.com/en/4.0/topics/i18n/    LANGUAGE_CODE = 'en-us'    TIME_ZONE = 'UTC'    USE_I18N = True    USE_TZ = True      # Static files (CSS, JavaScript, Images)  # https://docs.djangoproject.com/en/4.0/howto/static-files/    STATIC_URL = '/static/'    STATICFILES_DIRS = [      os.path.join(os.path.join(BASE_DIR, 'frontend'), 'build', 'static'),  ]      # Default primary key field type  # https://docs.djangoproject.com/en/4.0/ref/settings/#default-auto-field    DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'    # Heroku: Update database configuration from $DATABASE_URL.  import dj_database_url    db_from_env = dj_database_url.config(conn_max_age=500)  DATABASES['default'].update(db_from_env)  

mynotes/urls.py)

from django.contrib import admin
from django.urls import path, include, re_path
from django.views.generic import TemplateView

urlpatterns = [
    path('admin/', admin.site.urls),
    path('api/', include('api.urls')),
    re_path('^.*', TemplateView.as_view(template_name='index.html')),
]

output of terminal after executing heroku open)

[screenshot]


On ports 3000 and 8000, my operation tests worked as intended (CRUD with the RESTful API).

And I did the following in Heroku; there was no error, except for heroku open:

heroku login

heroku create [my server name]

git push heroku main

heroku run python manage.py migrate

heroku run python manage.py createsuperuser

heroku run python manage.py collectstatic


And I installed the buildpack:

heroku buildpacks:set heroku/python

heroku config:set DISABLE_COLLECTSTATIC=1


I've been stuck here for 2 days, and I would like some help on this!

Thank you!

Redux RTK reloading state when calling getSelectors

Posted: 10 Jun 2022 07:48 AM PDT

I am new to Redux RTK, so the problem might not be exactly in calling getSelectors(). However, when I'm using the state that comes from getSelectors(), it reloads the entire state.

Problem: The baseline is that I have different Setup objects that I'm fetching based on the documentId. These Setup objects are quite large, so in getSetups I am only fetching some basic properties. Then, when the user selects a specific Setup from the dropdown, I want to save it in the setupSlice. But when I trigger dispatch(setSetup(data)), RTK reloads all the Setups. I run into an infinite loop when, after fetching all the Setup objects, I try to automatically assign the default Setup to the setupSlice.

Extra: Ideally, when I assign a Setup to the setupSlice, I would like to call getSetup from RTK to fetch the entire Setup object for that specific Setup and store it in the setupSlice.

I am not sure if this is supposed to be happening, but is there any way to stop it? Otherwise, is there any recommendation so I can move forward?

This is the component I'm trying to generate:

const SetupDropdown = () => {

    const dispatch = useDispatch()
    const { documentId } = useParams()

    const { data, isFetching } = useGetSetupsQuery({ documentId })

    let setupsMenu;
    const { selectAll: selectAllSetups } = getSelectors({documentId})
    const allSetups = useSelector(selectAllSetups)

    if (!isFetching) {
        const defaultSetup = allSetups.find((setup) => setup.default)
        setupsMenu = allSetups.map(setup => {
            return (<MenuItem value={setup.id}>{setup.name}</MenuItem>)
        })
        dispatch(setSetup(defaultSetup))
    }

    const setupId = useSelector(selectSetupId)

    const handleChange = async (event) => {
        // Here I ideally call the getSetup RTK Query to fetch the entire information of the single setup
        const data = {
            id: event.target.value,
            name: 'Random name'
        }
        dispatch(setSetup(data))
    };

    return (
        <FormControl sx={{ minWidth: 200 }} size="small">
            <InputLabel>Setup</InputLabel>
            <Select
                value={setupId}
                onChange={handleChange}
                label="Setup"
            >
                {setupsMenu}
            </Select>
        </FormControl>
    )
}

export default SetupDropdown;
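
One aside on the component above (an assumption, not from the original post): dispatching during render, as the if (!isFetching) block does, triggers another render and can loop. Moving the default-setup dispatch into an effect keyed on the fetched data is the usual pattern (useEffect imported from react):

useEffect(() => {
    if (!isFetching && allSetups.length > 0) {
        // runs only when the fetched data changes, not on every render
        dispatch(setSetup(allSetups.find((setup) => setup.default)))
    }
}, [isFetching, allSetups, dispatch])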

This is the setupApiSlice:

const setupsAdapter = createEntityAdapter({
    sortComparer: (a, b) => b.date.localeCompare(a.date)
})

const initialState = setupsAdapter.getInitialState()

export const setupsApiSlice = apiSlice.injectEndpoints({
    tagTypes: ['Setup'],
    endpoints: builder => ({
        getSetups: builder.query({
            query: ({ documentId }) => ({
                url: `/documents/${documentId}/setups`,
                method: 'GET'
            }),
            transformResponse: responseData => {
                return setupsAdapter.setAll(initialState, responseData)
            },
            providesTags: (result, error, arg) => [
                { type: 'Setup', id: "LIST" },
                ...result.ids.map(id => ({ type: 'Setup', id }))
            ]
        }),
        getSetup: builder.query({
            query: ({ documentId, setupId }) => ({
                url: `/documents/${documentId}/setups/${setupId}`,
                method: 'GET'
            })
        })
    })
})

export const {
    useGetSetupsQuery,
    useGetSetupQuery
} = setupsApiSlice

// Define function to get selectors based on arguments (query) of getSetups
export const getSelectors = (
    query,
) => {
    const selectSetupsResult = setupsApiSlice.endpoints.getSetups.select(query)

    const adapterSelectors = createSelector(
        selectSetupsResult,
        (result) => setupsAdapter.getSelectors(() => result?.data ?? initialState)
    )

    return {
        selectAll: createSelector(adapterSelectors, (s) =>
            s.selectAll(undefined)
        ),
        selectEntities: createSelector(adapterSelectors, (s) =>
            s.selectEntities(undefined)
        ),
        selectIds: createSelector(adapterSelectors, (s) =>
            s.selectIds(undefined)
        ),
        selectTotal: createSelector(adapterSelectors, (s) =>
            s.selectTotal(undefined)
        ),
        selectById: (id) => createSelector(adapterSelectors, (s) =>
            s.selectById(s, id)
        ),
    }
}

This is the setupSlice:

const initialState = {
    name: null,
    filters: [],
    data: {},
    status: 'idle', //'idle' | 'loading' | 'succeeded' | 'failed'
    error: null
}

const setupSlice = createSlice({
    name: 'setup',
    initialState,
    reducers: {
        setSetup: (state, action) => {
            console.log('Dispatch')

            const setup = action.payload;
            console.log(setup)
            state.id = setup.id;
            state.name = setup.name;
            state.filters = setup.filters;
            state.data = setup.state;
            state.status = 'succeeded';
        }
    }
})

export const { setSetup } = setupSlice.actions;

export const selectSetupId = (state) => state.setup.id;
export const selectSetupName = (state) => state.setup.name;
export const selectSetupFilters = (state) => state.setup.filters;
export const selectSetupData = (state) => state.setup.data;

export default setupSlice.reducer;

SDL2 window is either transparent or rendering what's behind it on Arch Linux

Posted: 10 Jun 2022 07:48 AM PDT

The code looks like this:

#include"global.h"  static const int width = 800;  static const int height = 600;  int main (int argc, char **argv)  {      SDL_Init(SDL_INIT_VIDEO);        // Create the window      SDL_Window *window = SDL_CreateWindow("Ferengine",SDL_WINDOWPOS_UNDEFINED,SDL_WINDOWPOS_UNDEFINED,width,height, SDL_WINDOW_OPENGL);      // renderer      SDL_Renderer *renderer = SDL_CreateRenderer(window,-1,SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);      // settings      SDL_SetRenderDrawColor(renderer,255,0,0,255);      SDL_SetWindowOpacity(window,1.0f);      // loop       bool running = true;      SDL_Event event;      while(running)       {          while(SDL_PollEvent(&event))          {              if(event.type == SDL_QUIT)              {                  running = false;              }          }      }      SDL_RenderClear(renderer);        SDL_RenderPresent(renderer);      // release resources      SDL_DestroyRenderer(renderer);      SDL_DestroyWindow(window);      SDL_Quit();        return 0;  }  

global.h looks like this:

// SDL
#include <stdbool.h>
#include <SDL2/SDL.h>
#include <stdio.h>
#include <stdlib.h>

my make file looks like this

CC=gcc
S_DIR=Source
B_DIR=Build

build:
	$(CC) \
		-o \
		$(B_DIR)/Ferengine \
		$(S_DIR)/main.c \
		-lSDL2

clean:
	rm -f $(B_DIR)/*

file structure:

  • main folder
      • Makefile
      • source | build/executable
          • source/main.c | source/global.h

When I run make and then run the executable, the window is either rendering what's behind it or is transparent. I saw another question like this, but its answer did not work. I am using SDL2 version 2.0.22-2.
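
For what it's worth, a likely explanation (an assumption, not from the original post): SDL_RenderClear and SDL_RenderPresent sit after the event loop, so nothing is ever drawn while the window is open, and the compositor shows whatever was behind it. A minimal sketch drawing once per frame inside the loop:

while (running)
{
    while (SDL_PollEvent(&event))
    {
        if (event.type == SDL_QUIT)
        {
            running = false;
        }
    }
    SDL_SetRenderDrawColor(renderer, 255, 0, 0, 255);
    SDL_RenderClear(renderer);   /* paint the back buffer red */
    SDL_RenderPresent(renderer); /* flip it onto the screen   */
}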

jinja2.exceptions.UndefinedError: 'group' is undefined

Posted: 10 Jun 2022 07:49 AM PDT

I have a problem where group is undefined when it is used in the HTML. Can anyone help me fix this problem?

This is the code for the webpage:

@app.route('/<int:group_id>', methods=['GET', 'POST', 'PUT'])
@login_required
def group(group_id):
    if not current_user.is_authenticated:
        return redirect(url_for('index'))
    group = Chatgroup.query.get(group_id)
    check = False
    for users_in_group in group.users:
        if (users_in_group.username == current_user.username):
            check = True
    if (check == False):
        return redirect(url_for('index'))
    posts = group.posts
    users = group.users
    form = MessageForm()
    if form.is_submitted():
        if len(message.body.data) > 0:
            message = Message(body=message.body.data, author=current_user, group = group,
            username=current_user.username)
            db.session.add(message)
            db.session.commit()
            return render_template('groupchat.html', group=group, posts=posts, message=message)
        else:
            flash('Empty message')
            return render_template('groupchat.html', group=group, posts=posts, message=message)

This is the HTML code:

{% extends "base.html" %}

{% block app_content %}
    <h1>{{_('Messages') }}</h1>
    <h2>Online gebruikers</h2>
    <body>
        <p><a href="{{url_for('groupchat')}}">Make a new groupchat</p>
        {% for online_users_element in online_users %} <!--check if users are online every-->
        <p><a href="{{ url_for('group', group_id = group.id) }}">
            {{online_users_element.username}}
        </p>
        {% endfor %}
    </body>
{% endblock %}
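
One observation (an assumption, not from the original post): the template uses group.id inside the loop, so whichever view renders this particular template must pass a variable named group to render_template; the view shown only does that when rendering groupchat.html. A sketch of the shape the rendering call needs, with index.html as a hypothetical template name:

# hypothetical view rendering the template above: `group` must be in the context
return render_template('index.html', online_users=online_users, group=group)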

And this is the error message:

jinja2.exceptions.UndefinedError: 'group' is undefined

JsonPath Conditions from another array | JMETER

Posted: 10 Jun 2022 07:48 AM PDT

Using JSONPath I'm trying to fetch pools > id > options > id

based on this condition: componentGroups > id > poolId = pools > id, i.e. "poolId": "cb720883-04f1-4be4-a416-2f0c75230768" = "id": "cb720883-04f1-4be4-a416-2f0c75230768"

I mean one of those IDs from the payload below:

Example 1: for "poolId": "cb720883-04f1-4be4-a416-2f0c75230768", pools > id should be:

"id": "355548b5-b032-44f5-bf91-acfe987be3aa",

"id": "178709f3-f47d-4a6f-9aa1-f27958c628ea",

"id": "b0a227b0-2bf8-47cc-8422-5114edfb5c41",

"id": "ee8e2b49-7539-4ddd-90a4-6745b1385733",

Example 2: for "poolId": "fc2ab5c-c709-4b1d-81a7-e61a8d120d8d", pools > id should be:

"id": "cf85f8ca-7686-4585-b154-2e069e7bf976",

"id": "b0f6c744-af0a-4395-a2d5-bfaf39aeeab5",

the JSON extractor I use:

$.data[?(@.productId=='1231a29f0c3-9180-4cd7-a5c7-282f6689c31a')].pools..options[?(@.componentGroups..poolId = .pools.*.id)].id

{  "error": null,  "data": [      {          "id": "d8af6a50-1bf4-4792-97ea-f9c819c60f8e",          "productId": "1231a29f0c3-9180-4cd7-a5c7-282f6689c31a",          "productName": "Acme One Hundred Percent",          "pools": [              {                  "id": "cb720883-04f1-4be4-a416-2f0c75230768",                  "options": [                      {                          "id": "355548b5-b032-44f5-bf91-acfe987be3aa",                          "optionAmount": 1,                          "optionIndex": 0,                          "optionName": "$1",                          "audio": null,                          "image": "https://123./1-dollar-ball.png"                      },                      {                          "id": "178709f3-f47d-4a6f-9aa1-f27958c628ea",                          "optionAmount": 5,                          "optionIndex": 1,                          "optionName": "$5",                          "audio": null,                          "image": "https://123/5-dollar-ball.png"                      },                      {                          "id": "b0a227b0-2bf8-47cc-8422-5114edfb5c41",                          "optionAmount": 10,                          "optionIndex": 2,                          "optionName": "$10",                          "audio": null,                          "image": "https://123/10-dollar-ball.png"                      },                      {                          "id": "ee8e2b49-7539-4ddd-90a4-6745b1385733",                          "optionAmount": 20,                          "optionIndex": 3,                          "optionName": "$20",                          "audio": null,                          "image": "https://123/20-dollar-ball.png"                      }                  ],                  "poolName": "Panel Amounts",                  "audio": null,                  "image": null              },              {                  "id": "2f1403cb-df3e-4fbf-b594-352f76943895",                  "options": [                      {                          "id": "a5dc21e7-cc4e-4e93-b74f-1f4b19f322ef",                          "optionAmount": 0,                          "optionIndex": 0,                          "optionName": "Heads",                          "audio": null,                          "image": "https://123/OneHundredPercent/LibertyBellCoin.png"                      }                  ],                  "poolName": "Heads",                  "audio": null,                  "image": null              }          ],          "componentGroups": [              {                  "id": "126140ac-1cfa-40a3-9ab7-1a84469ab0db",                  "poolId": "cb720883-04f1-4be4-a416-2f0c75230768",                  "components": [                      {                          "id": "8291365c-df9f-4d7b-8f34-be0b500065ee",                          "componentConsumerSelectionType": 2,                          "componentName": "Panel Amount",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Panel Amount Components",                  "audio": null,                  "image": null              },              {                  "id": "03ed0e60-8fba-4e01-83c2-ce960bf5e889",                  "poolId": "2f1403cb-df3e-4fbf-b594-352f76943895",                  "components": [                      {                          "id": "3f4b1a61-5d80-4b78-8c20-9f2b92022cf4",                          
"componentConsumerSelectionType": 1,                          "componentName": "Heads",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Heads Components",                  "audio": null,                  "image": null              }          ]      },      {          "id": "b203c9bc-1a7e-4fe5-9b5b-d8021ea118c6",          "productId": "123557a97a5-ef93-46b7-868c-4f26439fbcb9",          "productName": "Acme Pick 3",          "pools": [              {                  "id": "efc2ab5c-c709-4b1d-81a7-e61a8d120d8d",                  "options": [                      {                          "id": "cf85f8ca-7686-4585-b154-2e069e7bf976",                          "optionAmount": 0.5,                          "optionIndex": 0,                          "optionName": "$.50",                          "audio": null,                          "image": "https://123/01.png"                      },                      {                          "id": "b0f6c744-af0a-4395-a2d5-bfaf39aeeab5",                          "optionAmount": 1,                          "optionIndex": 1,                          "optionName": "$1",                          "audio": null,                          "image": "https://123/01.png"                      }                  ],                  "poolName": "Panel Amounts",                  "audio": null,                  "image": null              },              {                  "id": "60b8910b-baa7-4776-bb7a-05f277887a11",                  "options": [                      {                          "id": "d51485a5-1cb4-4aca-8ad9-551ac0bddce3",                          "optionAmount": 0,                          "optionIndex": 0,                          "optionName": "Exact Order",                          "audio": null,                          "image": "https://123/tilestraight.png"                      },                      {                          "id": "4ed3eaa0-7b58-47db-8072-bbe78f0e40cf",                          "optionAmount": 0,                          "optionIndex": 1,                          "optionName": "Any Order",                          "audio": null,                          "image": "https://123/tilebox.png"                      },                      {                          "id": "46571384-8bea-404b-8aa1-c0965fa0bed1",                          "optionAmount": 0,                          "optionIndex": 2,                          "optionName": "50/50",                          "audio": null,                          "image": "https://123/tilefiftyfifty.png"                      },                      {                          "id": "325d52e8-ed3f-4a69-8f09-8e49a53608dd",                          "optionAmount": 0,                          "optionIndex": 3,                          "optionName": "Front Pair",                          "audio": null,                          "image": "https://123/tilefrontpair.png"                      },                      {                          "id": "5960a4af-0b74-4c2b-b8a4-2dc4033e96a2",                          "optionAmount": 0,                          "optionIndex": 4,                          "optionName": "Split Pair",                          "audio": null,                          "image": "https://123/tilesplitpair.png"                      },                      {                          "id": "c2a817ca-1bd8-4b5b-8f6f-e12f3286ea54",                          "optionAmount": 0,   
                       "optionIndex": 5,                          "optionName": "Back Pair",                          "audio": null,                          "image": "https://123/tilebackpair.png"                      }                  ],                  "poolName": "Play Types",                  "audio": null,                  "image": null              },              {                  "id": "53fb7c9a-17dd-4f95-94b0-0110433404a2",                  "options": [                      {                          "id": "fae5912c-9d4a-466b-b515-d60b3186c2e5",                          "optionAmount": 0,                          "optionIndex": 0,                          "optionName": "0",                          "audio": null,                          "image": "https://123/tilegreen00.png"                      },                      {                          "id": "42055e6a-f08a-48b4-a7c8-41b87f11ca51",                          "optionAmount": 0,                          "optionIndex": 1,                          "optionName": "1",                          "audio": null,                          "image": "https://123/tilegreen01.png"                      },                      {                          "id": "ea28a99c-6068-4d67-8e2d-c6857656b499",                          "optionAmount": 0,                          "optionIndex": 2,                          "optionName": "2",                          "audio": null,                          "image": "https://123/tilegreen02.png"                      },                      {                          "id": "bf5e29ea-a377-46ad-865d-e4585d652b13",                          "optionAmount": 0,                          "optionIndex": 3,                          "optionName": "3",                          "audio": null,                          "image": "https://123/tilegreen03.png"                      },                      {                          "id": "41c26a4b-ce86-4c28-ae5a-35c13a539fbc",                          "optionAmount": 0,                          "optionIndex": 4,                          "optionName": "4",                          "audio": null,                          "image": "https://123/tilegreen04.png"                      },                      {                          "id": "fcb34895-25b9-4e5e-8a00-652febfcd31e",                          "optionAmount": 0,                          "optionIndex": 5,                          "optionName": "5",                          "audio": null,                          "image": "https://123/tilegreen05.png"                      },                      {                          "id": "0f0cf6ef-a7b7-4d11-98c4-0ea2cfd32f39",                          "optionAmount": 0,                          "optionIndex": 6,                          "optionName": "6",                          "audio": null,                          "image": "https://123/tilegreen06.png"                      },                      {                          "id": "3db2fafd-698c-44bb-a4be-f9e43ee6248b",                          "optionAmount": 0,                          "optionIndex": 7,                          "optionName": "7",                          "audio": null,                          "image": "https://123/tilegreen07.png"                      },                      {                          "id": "25d17003-9db8-407b-b6e2-65c0c2f014a0",                          "optionAmount": 0,                          "optionIndex": 8,                          "optionName": "8",                          "audio": 
null,                          "image": "https://123/tilegreen08.png"                      },                      {                          "id": "37232037-ca87-4f2e-99a4-40cbfec2bf62",                          "optionAmount": 0,                          "optionIndex": 9,                          "optionName": "9",                          "audio": null,                          "image": "https://123/tilegreen09.png"                      }                  ],                  "poolName": "Numbers",                  "audio": null,                  "image": null              }          ],          "componentGroups": [              {                  "id": "85ee94f1-fbcf-4392-aa08-5320abc13dcc",                  "poolId": "efc2ab5c-c709-4b1d-81a7-e61a8d120d8d",                  "components": [                      {                          "id": "a3d14456-64cb-4cae-b749-133c4b6a476a",                          "componentConsumerSelectionType": 2,                          "componentName": "Panel Amount",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Panel Amount Components",                  "audio": null,                  "image": null              },              {                  "id": "2be68fe5-2872-4631-b994-bb8e691db233",                  "poolId": "60b8910b-baa7-4776-bb7a-05f277887a11",                  "components": [                      {                          "id": "d50aeef2-4821-4247-b712-585a2ba55f81",                          "componentConsumerSelectionType": 2,                          "componentName": "Play Type",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Play Type Components",                  "audio": null,                  "image": null              },              {                  "id": "9b3433e8-48f3-4b37-bccb-e0ed5c55ffe8",                  "poolId": "53fb7c9a-17dd-4f95-94b0-0110433404a2",                  "components": [                      {                          "id": "ed6a4930-2558-414d-b47f-a3f95e719f60",                          "componentConsumerSelectionType": 0,                          "componentName": "Front Number",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Front Number Components",                  "audio": null,                  "image": null              },              {                  "id": "6ca0c5d1-0939-4603-9c0f-d6be5171a068",                  "poolId": "53fb7c9a-17dd-4f95-94b0-0110433404a2",                  "components": [                      {                          "id": "e0957685-22d8-4827-a3db-a1e2e0b5b76b",                          "componentConsumerSelectionType": 0,                          "componentName": "Middle Number",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Middle Number Components",                  "audio": null,                  "image": null              },              {                  "id": "e2ed942b-98e2-4d91-a500-ab5884acf66f",                  "poolId": "53fb7c9a-17dd-4f95-94b0-0110433404a2",                  "components": [                      {                          "id": "98b3c77a-bb25-4a45-a2ca-c90a79c75ecb",                          
"componentConsumerSelectionType": 0,                          "componentName": "Back Number",                          "audio": null,                          "image": null                      }                  ],                  "componentGroupName": "Back Number Components",                  "audio": null,                  "image": null              }          ]      }  ]  

}
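
A side note (an assumption, not from the original post): JSONPath filter expressions can't compare a value in one branch against values in a sibling array, so this kind of join is usually done in a JMeter JSR223 PostProcessor instead of a JSON Extractor. A Groovy sketch of the matching logic:

import groovy.json.JsonSlurper

def response = new JsonSlurper().parseText(prev.getResponseDataAsString())
def product = response.data.find { it.productId == '1231a29f0c3-9180-4cd7-a5c7-282f6689c31a' }

// pool ids referenced by the product's componentGroups
def poolIds = product.componentGroups*.poolId

// option ids of every pool whose id appears in componentGroups
def optionIds = product.pools.findAll { poolIds.contains(it.id) }
                             .collectMany { it.options*.id }

vars.put('optionIds', optionIds.join(','))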

Which property changes punctuation color/foreground in monaco editor defineTheme?

Posted: 10 Jun 2022 07:48 AM PDT

Hope this is a silly question that's easy to solve. I'm trying to make a custom code editor with Monaco editor in React, so I'm using the @monaco-editor/react library. To customize it I went with the defineTheme function provided by the base monaco-editor library, and this is what I got:

monaco.editor.defineTheme('my-theme', {
  base: 'vs',
  inherit: true,
  rules: [],
  colors: {
    foreground: '#cccce5',
    'editor.foreground': '#cccce5',
    'textSeparator.foreground': '#cccce5',
    'editor.background': '#00004c',
    'editorCursor.foreground': '#00000050',
    'editor.lineHighlightBackground': '#00000050',
    'editorLineNumber.foreground': '#008800',
    'editor.selectionBackground': '#00000050',
    'editor.inactiveSelectionBackground': '#00000050',
    'editorWidget.background': '#00000050',
    'editorWidget.foreground': '#cccce5',
  },
})

It's looking pretty good so far, but the one problem I have is that while the code text is white, all the punctuation and symbols are black, and I can't find which property changes this color. I'm mainly using this tool from Microsoft to learn how I'm supposed to customize it: https://microsoft.github.io/monaco-editor/playground.html#customizing-the-appearence-exposed-colors

You can go to that website, replace the default code with mine, then below that change the language from 'text-plain' to 'javascript', click Run, and you'll see the issue I'm having.
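
A sketch of one likely answer (an assumption, not from the original post): punctuation color comes from token rules rather than the colors map. In Monaco's Monarch tokenizers, punctuation is typically tokenized as delimiter, so rules along these lines may be the missing piece:

monaco.editor.defineTheme('my-theme', {
  base: 'vs',
  inherit: true,
  rules: [
    { token: 'delimiter', foreground: 'cccce5' },         // . , ; and friends
    { token: 'delimiter.bracket', foreground: 'cccce5' }, // { } ( ) [ ]
  ],
  colors: {
    // ...same colors as above...
  },
})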

How to detect which files which have been added to a directory over time?

Posted: 10 Jun 2022 07:49 AM PDT

I am reading a directory of 180 files, and every day a file is added. I need to read all the files today, but tomorrow I need to run the code and have it not return any old files that have already been read. Is there any way to do this? Here is my current loop to iterate over and read all the files.

# iterating through all the files
for file in os.listdir():
    # check if file is in .ls format or not
    if file.endswith(".ls"):
        filepath = f"{path}\{file}"
        # calling the read function
        axisMonitorAverage(filepath)
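
A minimal sketch of one approach (an assumption, not from the original post): persist the names of already-processed files between runs and skip them next time. processed_files.json is a hypothetical bookkeeping file; path and axisMonitorAverage come from the code above:

import json
import os

SEEN_FILE = "processed_files.json"  # hypothetical state file

# load the set of file names handled by previous runs
seen = set()
if os.path.exists(SEEN_FILE):
    with open(SEEN_FILE) as f:
        seen = set(json.load(f))

for file in os.listdir():
    if file.endswith(".ls") and file not in seen:
        axisMonitorAverage(os.path.join(path, file))
        seen.add(file)

# remember everything processed so far for the next run
with open(SEEN_FILE, "w") as f:
    json.dump(sorted(seen), f)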

How can I get a value corresponding to the maximum average in this SQL exercise?

Posted: 10 Jun 2022 07:49 AM PDT

I tried to solve this exercise, but I'm stuck when it comes to showing the CCode with the maximum average of attending students. I attach pictures of the exercise and the database tables. Thanks in advance.

Here's my code:

SELECT CO.TCode, CO.CCode
FROM COURSE CO
WHERE CO.CCode NOT IN (
    SELECT CCode
    FROM COURSE
    WHERE Topic <> 'database')
AND CO.CCode = (
    SELECT C1.CCode
    FROM (SELECT CCode, AVG(AttendingStudent#) MEDIA
          FROM LECTURE
          GROUP BY CCode) C1
    WHERE MAX(C1.MEDIA) AND C1.CCode = CO.CCode
)

I don't think the MAX in the WHERE clause can work.
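
A sketch of the aggregate part (an assumption about intent, not from the original post): an aggregate like MAX can't be used as a bare WHERE condition, so compare each course's average against the maximum of all per-course averages instead:

SELECT C1.CCode
FROM (SELECT CCode, AVG(AttendingStudent#) AS MEDIA
      FROM LECTURE
      GROUP BY CCode) C1
WHERE C1.MEDIA = (SELECT MAX(C2.MEDIA)
                  FROM (SELECT AVG(AttendingStudent#) AS MEDIA
                        FROM LECTURE
                        GROUP BY CCode) C2)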

unstack stacked ggplot legend

Posted: 10 Jun 2022 07:48 AM PDT

I'm working with a chemistry dataset where I have 11 different chemicals, here labeled under the columns c1, c2, ..., c11.

I have made pie charts using library(ggplot2), and would like to do 2 things with my plot.

  1. Display all variables in the legend in a horizontal fashion (done), and not have them stacked (not done), as you see in my example. Having just one line would be great. 2 lines could also be acceptable.
  2. Change colors to be color-blind friendly

Here is a pretend dataset we can work with so you can see what I have at this point. I have tried searching "legend margins" to increase the area the legend is plotted on, but to no avail.

data <- read.delim("https://pastebin.com/raw/MS5GLAxa", header = T)

ggplot(data, aes(x="", y=ratio, fill=chemical)) +
  geom_bar(stat="identity", width=1, position = position_fill()) +
  facet_wrap(~treatment, nrow=1) +
  coord_polar("y", start=0) +
  theme_void(base_size = 20) +
  theme(legend.position=c(0.5, 1.2), legend.direction = "horizontal") +
  theme(plot.margin=unit(c(0,0,0,0), 'cm'))


Some side bonuses here would be to be able to:

  • increase the size of the pie chart (I believe I achieved this with making my margins as small as possible on the sides)
  • have the pie chart use solid colors, with no white lines in the graph
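
A sketch addressing both asks (assumptions, not from the original post): guide_legend(nrow = 1) lays the legend keys out on a single row, scale_fill_viridis_d() swaps in a color-blind-friendly palette, and colour = NA on the bars is one way to drop any slice borders:

library(ggplot2)

# `data` is the pretend dataset read above
ggplot(data, aes(x = "", y = ratio, fill = chemical)) +
  geom_bar(stat = "identity", width = 1, position = position_fill(), colour = NA) +
  facet_wrap(~treatment, nrow = 1) +
  coord_polar("y", start = 0) +
  scale_fill_viridis_d() +                 # color-blind-friendly palette
  guides(fill = guide_legend(nrow = 1)) +  # one-row legend
  theme_void(base_size = 20) +
  theme(legend.position = c(0.5, 1.2), legend.direction = "horizontal") +
  theme(plot.margin = unit(c(0, 0, 0, 0), 'cm'))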

read csv files in R with rowname according to filename

Posted: 10 Jun 2022 07:48 AM PDT

Let's take three files with differing row counts and the following contents.

file1.csv
# civet   woodpecker
#     1            3
#     3            4

file2.csv
# civet   woodpecker
#     6            4

file3.csv
# civet   woodpecker
#     0            2
#     5            4
#     3            9
#     2            4

I want to read it in to make a single new.rds file with the expected output:

new.rds
# filename civet   woodpecker
#    file1     1            3
#    file1     3            4
#    file2     6            4
#    file3     0            2
#    file3     5            4
#    file3     3            9
#    file3     2            4

I could do it for equal-sized .csv files. But I'm finding it hard because in the real case I don't know the row counts, so the following doesn't work!

files <- list.files(path=".", pattern="file.*csv"); files;
files <- set_names(files, paste0("file", 1:3))

files %>% map_dfr(read_csv, .id = 'dataset') %>%
    saveRDS(file = "new.rds");

Will appreciate any input!
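
A sketch of one way around the hard-coded 1:3 (an assumption, not from the original post): name the vector from the file names themselves, so neither the number of files nor their row counts matters:

library(purrr)
library(readr)

files <- list.files(path = ".", pattern = "file.*\\.csv")
# use each file's own name (minus extension) as its id
files <- set_names(files, tools::file_path_sans_ext(files))

files %>%
  map_dfr(read_csv, .id = "filename") %>%
  saveRDS(file = "new.rds")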

Save only those Outlook Email Attachments from Inbox, which are sent by specific sender on specific dates

Posted: 10 Jun 2022 07:48 AM PDT

Hi, I have more than 3,500 emails in my Outlook inbox. My conditions are:

  1. I want to save all attachments from some specific senders for some specific dates, as per my requirements, using Outlook VBA. My attempt so far:
Sub Save_OutLook_Attachments_To_Folder()
    Dim App As Outlook.Application
    Dim MyFolder As Outlook.MAPIFolder
    Dim NS As Outlook.Namespace
    Dim Msg As Outlook.MailItem
    Dim Attachments As Outlook.Attachments
    Dim Items As Outlook.Items
    Dim i As Long
    Dim lngCount As Long
    Dim File As String
    Dim FolderPath As String
    Dim Sender_Email As String ''''

    Sender_Email = "someone@somewhere.com" '''''''objMail.SenderEmailAddress ''''''
    FolderPath = ThisWorkbook.Path & "\" & "\Outlook Attachments\" '''Change the Destination Folder Path
    Set App = New Outlook.Application
    Set NS = App.GetNamespace("MAPI")
    Set MyFolder = NS.GetDefaultFolder(olFolderInbox) ''''''''''''
    Set Items = MyFolder.Items
    Items.Sort Property:="[ReceivedTime]", Descending:=True
    FolderPath = FolderPath
    For Each Msg In MyFolder.Items
        If Msg.SenderEmailAddress = Sender_Email Then '''''
            Set Attachments = Msg.Attachments
            lngCount = Attachments.Count
        End If
    Next
    If lngCount > 0 Then
        For i = lngCount To 1 Step -1
            File = Attachments.Item(i).FileName
            File = FolderPath & File
            Attachments.Item(i).SaveAsFile File
        Next
    End If
End Sub
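
A sketch of the date/sender filtering (an assumption, not from the original post): Items.Restrict narrows the folder to one sender and a date window before looping, which avoids scanning all 3,500+ items, and saving inside the loop keeps each message's attachments together (the version above only keeps a reference to the last matching message's attachments). The dates are placeholders:

Dim Filtered As Outlook.Items
Set Filtered = MyFolder.Items.Restrict( _
    "[SenderEmailAddress] = 'someone@somewhere.com'" & _
    " And [ReceivedTime] >= '06/01/2022' And [ReceivedTime] < '06/11/2022'")

For Each Msg In Filtered
    ' save this message's attachments before moving on to the next one
    For i = Msg.Attachments.Count To 1 Step -1
        Msg.Attachments.Item(i).SaveAsFile FolderPath & Msg.Attachments.Item(i).FileName
    Next
Next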

I would appreciate it if anyone could help me with this case.

Thanks in advance.

How to pass API data into state in React

Posted: 10 Jun 2022 07:49 AM PDT

I have been learning React for the past week and I am finding it really interesting. However, I am stuck. I am making a call to an API and using a custom fetch hook to get the data. This is represented where I declare a constant and extract the data, along with other general functions to show an error and show a loader before the data is displayed. Then I pass this data as a prop into another component called AnimeList, which is responsible for mapping over the data.

What I want to do now is to pass this data into a state so I can modify it later as required. I then want to pass the state as a prop to the AnimeList component rather than the data directly. I have declared a piece of state called topAnime, and I am using the useEffect hook to set the topAnime state to the data. However, I get an error in the console saying that data is basically null, which I don't understand. I feel like it might be something to do with the dependency array in the useEffect hook, but I am not sure. Any guidance would be much appreciated. Below is my code. Thanks.

import { useFetch } from '../hooks/useFetch'
import Error from "./Error"
import Spinner from "../components/spinner/Spinner"
import AnimeList from "../components/animelist/AnimeList"
import Header from '../Header'
import './TopAnime.css'
import { useState } from 'react'
import { useEffect } from 'react'

function TopAnime() {

  const [topAnime, setTopAnime] = useState([])
  const {data, isPending, error} = useFetch('https://api.jikan.moe/v4/top/anime')

  useEffect(() => {
    (async () => {
      setTopAnime(data);
    })();
  }, [data]);

  return (
    <div className="top-anime">
      <Header/>
      {error && <Error/>}
      {isPending && <Spinner/>}
      {data && <h1>Top Anime</h1>}
      {data && <AnimeList animelist={topAnime}/>}
    </div>
  )
}

export default TopAnime
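
A minimal sketch of a likely fix (an assumption, not from the original post): data starts out null until the fetch resolves, so the first effect run copies null into topAnime; guarding the assignment avoids that:

useEffect(() => {
  if (data) {
    // only copy into state once the fetch has actually resolved
    setTopAnime(data)
  }
}, [data])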

Here is the error I get: [screenshot]

Why does the map function not work on nested objects?

Posted: 10 Jun 2022 07:49 AM PDT

I have an array containing objects:

let sportTag = [
  { id: 1, name: 'FOOTBALL', found: false },
  { id: 2, name: 'TENNIS', found: false },
  { id: 3, name: 'BASKET', found: false },
]

I also have another array containing objects, and every object has a field (sports) that is an array:

let person = [
  {
    id: 1,
    name: "Mark",
    age: 23,
    sports: [
      "volleyball",
      "rugby",
      "tennis"
    ],
  },
  {
    id: 2,
    name: "Rupert",
    age: 40,
    sports: [
      "golf"
    ],
  },
  {
    id: 3,
    name: "John",
    age: 31,
    sports: [
      "football",
      "golf",
      "rugby",
      "tennis",
    ],
  },
]

I would like to change a sportTag's found field to true when the sportTag name is equal to one of any person's sports. I tried with a nested map:

const result = sportTag.map(st => {
  person.map(p => {
    p.sports.map(s => {
      if (st.name.toLocaleUpperCase() === s.toLocaleUpperCase()) {
        return {
          ...st, found: true
        }
      }
      return s
    })
    return p
  })
  return st
})

console.log(sportTag)
//OUTPUT
// { id: 1, name: 'FOOTBALL', found: false },
// { id: 2, name: 'TENNIS', found: false },
// { id: 3, name: 'BASKET', found: false }

console.log(result)
//OUTPUT
// { id: 1, name: 'FOOTBALL', found: false },
// { id: 2, name: 'TENNIS', found: false },
// { id: 3, name: 'BASKET', found: false }

Why do no changes happen? I'm expecting result to be:

{ id: 1, name: 'FOOTBALL', found: true },
{ id: 2, name: 'TENNIS', found: true },
{ id: 3, name: 'BASKET', found: false }
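
A minimal sketch of why this happens and one fix (an assumption, not from the original post): map() builds new arrays and never mutates the objects it visits, and here the inner maps' return values are discarded entirely; some() expresses the "does any person play this sport?" check directly:

const result = sportTag.map(st => ({
  ...st,
  found: person.some(p =>
    p.sports.some(s => s.toLocaleUpperCase() === st.name.toLocaleUpperCase())
  ),
}))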

Delete lines containing a string after a matching pattern?

Posted: 10 Jun 2022 07:49 AM PDT

In my YAML file, I want to delete a block using sed after matching a pattern.

Example:

Keypages:
- name: Key pages
LaunchPad:
- name: "\U0001F680Launch Pad"
- location: US
Microservices:
- name: 'Micro services '

In this example, when matching LaunchPad (for example) at the start of a line, I want to delete it and its block (for me, the lines starting with -, stopping at the first line that doesn't).

Result :

Keypages:
- name: Key pages
Microservices:
- name: 'Micro services '

Thank you!
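
A sed sketch that produces this result (an assumption, not from the original post): open a range at the LaunchPad: line, close it at the first line that doesn't start with -, and inside the range delete only the header and the - lines so the closing line survives:

sed '/^LaunchPad:/,/^[^-]/{ /^LaunchPad:/d; /^-/d; }' file.yaml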

T4MVC throws a serialization error when trying to "Run Custom Tool"

Posted: 10 Jun 2022 07:48 AM PDT

Today, after upgrading Visual Studio to 17.2.3 I get this error when trying to run T4MVC:

Running transformation: System.Runtime.Serialization.SerializationException: Type 'Microsoft.VisualStudio.Platform.WindowManagement.DTE.WindowBase' in Assembly 'Microsoft.VisualStudio.Platform.WindowManagement, Version=17.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' is not marked as serializable.

I have seen others with the same issue, but the solutions suggested there do not solve it for me:

  • VS2022 17.2.0 Preview 2.0: T4 template serialization exception when accessing projects, etc
  • T4 error when running Custom Tool in vs2017

I have tried removing/adding the project, I've tried restarting Windows, and I have no idea where to look, as this was working without issues previously.

Any suggestions?

EDIT: After debugging more I can see that the inner exception is: Unable to cast COM object of type 'System.__ComObject' to interface type 'Microsoft.VisualStudio.Shell.Interop.IVsQueryEditQuerySave2'

If Else and Exception in JForm

Posted: 10 Jun 2022 07:49 AM PDT

I am trying to throw an exception when the user enters an integer other than 1 or 2 into the JTextField:

public class SearchEmployee extends javax.swing.JFrame {

    /**
     * Creates new form SearchEmployee
     */
    public SearchEmployee() {

        initComponents();

    }

    // A utility function to check
    // whether a code is valid or not
    public static boolean isCodeValid(String id)
        throws IdNotFoundException
    {
        if(!id.equals("1")){
            throw new IdNotFoundException();
        }else if(!id.equals("2")){
            throw new IdNotFoundException();
        }
        else{
            return true;
        }
    }

and supposedly, after users click the search button, they go to a new page where the employee details are displayed.

private void searchActionPerformed(java.awt.event.ActionEvent evt) {
    // TODO add your handling code here:
    String id = staffID.getText();

    try{
        if(isCodeValid("1")){
            EmployeeDetails emp = new EmployeeDetails();
            emp.name.setText("Sky Peach");
            emp.email.setText("skypeach@gmail.com");
            emp.address.setText("Pangsapuri Sri Puteri");
            emp.phoneNo.setText("1999999999");
            emp.department.setText("IT");
            emp.designation.setText("Software Developer");
            emp.show();
        }else if(isCodeValid("2")){
            EmployeeDetails emp = new EmployeeDetails();
            emp.name.setText("Sky Orange");
            emp.email.setText("skyorange@gmail.com");
            emp.address.setText("Pangsapuri Sri Puteri");
            emp.phoneNo.setText("2999999999");
            emp.department.setText("IT");
            emp.designation.setText("Software Engineer");
            emp.show();
        }
    }catch (IdNotFoundException ex) {
        JOptionPane.showMessageDialog(this, ex.getMessage());
    }

}

However, the exception was thrown even when I entered the integers 1 and 2. How do I fix this error?
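
A sketch of the logic error (an assumption, not from the original post): every id fails at least one of !id.equals("1") and !id.equals("2"), so the method always throws; and the handler passes the literals "1" and "2" instead of the typed id. One possible shape:

public static boolean isCodeValid(String id) throws IdNotFoundException {
    // throw only when the id is neither "1" nor "2"
    if (!id.equals("1") && !id.equals("2")) {
        throw new IdNotFoundException();
    }
    return true;
}

// in searchActionPerformed: validate the actual input, then branch on it
if (isCodeValid(id)) {
    if (id.equals("1")) {
        // ... show Sky Peach ...
    } else {
        // ... show Sky Orange ...
    }
}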

Ansible eth1 IPv4 address in j2 template

Posted: 10 Jun 2022 07:49 AM PDT

I'm currently working on a school assignment where I have to configure HAProxy to load-balance between my two webservers.

I'm deploying the machines via Vagrant in VirtualBox. After this, my Ansible playbook runs and starts off by configuring the webservers. After the webservers are done, it configures the loadbalancer.

Sadly, I can't manage to get the IPv4 addresses of both eth1 adapters added to the haproxy.conf. I'm repeatedly getting the message that Ansible can't find the variable inside the hostvars.

TASK [Configuring haproxy]


fatal: [HDVLD-TEST-LB01]: FAILED! => {"changed": false, "msg": "AnsibleUndefinedVariable: 'ansible.vars.hostvars.HostVarsVars object' has no attribute 'ansible_eth1'"}

On top of this, HAProxy is not responding on 10.2.2.20:8080; Chrome gives me an

ERR_CONNECTION_REFUSED

I hope someone here can help me out.

I'll paste my code down here.

Vagrantfile

# -*- mode: ruby -*-
# vi: set ft=ruby :

# All Vagrant configuration is done below. The "2" in Vagrant.configure
# configures the configuration version (we support older styles for
# backwards compatibility). Please don't change it unless you know what
# you're doing.
Vagrant.configure("2") do |config|
#  config.ssh.insert_key = false

  #webserver
  (1..2).each do |i|
    config.vm.define "HDVLD-TEST-WEB0#{i}" do |webserver|
      webserver.vm.box = "ubuntu/trusty64"
      webserver.vm.hostname = "HDVLD-TEST-WEB0#{i}"
      webserver.vm.network :private_network, ip: "10.2.2.1#{i}"
      webserver.vm.provider :virtualbox do |vb|
        vb.memory = "524"
        vb.customize ["modifyvm", :id, "--nested-hw-virt", "on"]
      end
      webserver.vm.provision "shell" do |shell|
        ssh_pub_key = File.readlines("#{Dir.home}/.ssh/id_rsa.pub")
        shell.inline = <<-SHELL
          echo #{ssh_pub_key} >> /home/vagrant/.ssh/authorized_keys
          echo #{ssh_pub_key} >> /root/.ssh/authorized_keys
        SHELL
      end
    end

#  config.vm.define "HDVLD-TEST-DB01" do|db_server|
#    db_server.vm.box = "ubuntu/trusty64"
#    db_server.vm.hostname = "HDVLD-TEST-DB01"
#    db_server.vm.network :private_network, ip: "10.2.2.30"
#  end

    config.vm.define "HDVLD-TEST-LB01" do |lb_server|
      lb_server.vm.box = "ubuntu/trusty64"
      lb_server.vm.hostname = "HDVLD-TEST-LB01"
      lb_server.vm.network :private_network, ip: "10.2.2.20"
      lb_server.vm.provider :virtualbox do |vb|
        vb.memory = "524"
        vb.customize ["modifyvm", :id, "--nested-hw-virt", "on"]
      end
      lb_server.vm.provision "shell" do |shell|
        ssh_pub_key = File.readlines("#{Dir.home}/.ssh/id_rsa.pub")
        shell.inline = <<-SHELL
          echo #{ssh_pub_key} >> /home/vagrant/.ssh/authorized_keys
          echo #{ssh_pub_key} >> /root/.ssh/authorized_keys
        SHELL
      end
    end

    config.vm.provision :ansible do |ansible|
      ansible.playbook = "webserver_test.yml"
      ansible.groups = {
        "webservers" => ["HDVLD-TEST-WEB01", "HDVLD-TEST-WEB02"],
        "loadbalancer" => ["HDVLD-TEST-LB01"]
      }
    end
  end
end

Playbook.yml

- hosts: webservers
  become: true
  vars_files: vars/default.yml
  gather_facts: True
  tasks:

    # Getting the IP address of eth0 interface
    - name: Gather facts from new server
      delegate_facts: True
      setup:
        filter: ansible_eth1.ipv4.address

    - name: Debug facts from Server
      delegate_facts: True
      debug:
        var: ansible_eth1.ipv4.address

    - name: UPurge
      apt: purge=yes

    - name: Install latest version of Apache
      apt: name=apache2 update_cache=yes state=latest

    - name: Install latest version of Facter
      apt: name=facter state=latest

    - name: Create document root for your domain
      file:
        path: /var/www/{{ http_host }}
        state: directory
        mode: '0755'

    - name: Copy your index page
      template:
        src: "files/index.html.j2"
        dest: "/var/www/{{ http_host }}/index.html"

    - name: Set up virtuahHost
      template:
        src: "files/apache.conf.j2"
        dest: "/etc/apache2/sites-available/{{ http_conf }}"
      notify: restart-apache

    - name: Enable new site {{ http_host }}
      command: a2ensite {{ http_host }}

    - name: Disable default site
      command: a2dissite 000-default
      when: disable_default
      notify: restart-apache

    - name: "UFW firewall allow HTTP on port {{ http_port }}"
      ufw:
        rule: allow
        port: "{{ http_port }}"
        proto: tcp

  handlers:
    - name: restart-apache
      service:
        name: apache2
        state: restarted

- hosts: loadbalancer
  become: true
  vars_files: vars/default.yml
  gather_facts: true
  tasks:
    - name: "Installing haproxy"
      package:
        name: "haproxy"
        state: present

    - name: "Starting haproxy"
      service:
        name: "haproxy"
        state: started
        enabled: yes

    - name: "Configuring haproxy"
      template:
        src: "files/haproxy.conf.j2"
        dest: "/etc/haproxy/haproxy.cfg"
      notify: restart-haproxy

    - name: "UFW firewall allow Proxy on port {{ proxy_port }}"
      ufw:
        rule: allow
        port: "{{ proxy_port }}"
        proto: tcp

    - name: "UFW firewall allow static port on port {{ staticlb_port }}"
      ufw:
        rule: allow
        port: "{{ staticlb_port }}"
        proto: tcp

    - name: Gather facts from new Server
      setup:
        filter: ansible_default_ipv4.address

  handlers:
    - name: restart-haproxy
      service:
        name: haproxy
        state: restarted

Haproxy.conf.j2

#---------------------------------------------------------------------
# Example configuration for a possible web application.  See the
# full configuration options online.
#
#   https://www.haproxy.org/download/1.8/doc/configuration.txt
#
#---------------------------------------------------------------------

#---------------------------------------------------------------------
# Global settings
#---------------------------------------------------------------------
global
    # to have these messages end up in /var/log/haproxy.log you will
    # need to:
    #
    # 1) configure syslog to accept network log events.  This is done
    #    by adding the '-r' option to the SYSLOGD_OPTIONS in
    #    /etc/sysconfig/syslog
    #
    # 2) configure local2 events to go to the /var/log/haproxy.log
    #   file. A line like the following can be added to
    #   /etc/sysconfig/syslog
    #
    #    local2.*                       /var/log/haproxy.log
    #
    log         127.0.0.1 local2

    chroot      /var/lib/haproxy
    pidfile     /var/run/haproxy.pid
    maxconn     1000
    user        haproxy
    group       haproxy
    daemon

    # turn on stats unix socket
    stats socket /var/lib/haproxy/stats

    # utilize system-wide crypto-policies
    ssl-default-bind-ciphers PROFILE=SYSTEM
    ssl-default-server-ciphers PROFILE=SYSTEM

#---------------------------------------------------------------------
# common defaults that all the 'listen' and 'backend' sections will
# use if not designated in their block
#---------------------------------------------------------------------
defaults
    mode                    http
    log                     global
    option                  httplog
    option                  dontlognull
    option http-server-close
    option forwardfor       except 127.0.0.0/8
    option                  redispatch
    retries                 3
    timeout http-request    10s
    timeout queue           1m
    timeout connect         10s
    timeout client          1m
    timeout server          1m
    timeout http-keep-alive 10s
    timeout check           10s
    maxconn                 3000

listen haproxy-monitoring *:{{ proxy_port }}

#---------------------------------------------------------------------
# main frontend which proxys to the backends
#---------------------------------------------------------------------
frontend main
    bind *:{{ http_port }}
    acl url_static       path_beg       -i /static /images /javascript /stylesheets
    acl url_static       path_end       -i .jpg .gif .png .css .js

    use_backend static          if url_static
    default_backend             app

#---------------------------------------------------------------------
# static backend for serving up images, stylesheets and such
#---------------------------------------------------------------------
backend static
    balance     roundrobin
    server      static 127.0.0.1:{{ staticlb_port }} check

#---------------------------------------------------------------------
# round robin balancing between the various backends
#---------------------------------------------------------------------
backend app
    balance     roundrobin
    {% for host in groups['webservers'] %}
    {{ hostvars[host].ansible_eth1.ipv4.address }}:{{ http_port }} check
    {% endfor %}

defaults (vars file)

http_host: "hdvld"
http_conf: "hdvld.conf"
http_port: "80"
proxy_port: "8080"
disable_default: true
staticlb_port: "4331"

I'm doing something wrong, but I can't find the issue. I spent all of yesterday searching and experimenting, so there may be some leftover quoted pieces of code inside the files; please ignore them. (One thing worth checking is sketched after the inventory file below.)

Update: I've added the inventory file below.

# Generated by Vagrant

HDVLD-TEST-LB01 ansible_host=127.0.0.1 ansible_port=2200 ansible_user='vagrant' ansible_ssh_private_key_file='/home/web01/VM2022/template/.vagrant/machines/HDVLD-TEST-LB01/virtualbox/private_key'
HDVLD-TEST-WEB02 ansible_host=127.0.0.1 ansible_port=2201 ansible_user='vagrant' ansible_ssh_private_key_file='/home/web01/VM2022/template/.vagrant/machines/HDVLD-TEST-WEB02/virtualbox/private_key'
HDVLD-TEST-WEB01 ansible_host=127.0.0.1 ansible_port=2222 ansible_user='vagrant' ansible_ssh_private_key_file='/home/web01/VM2022/template/.vagrant/machines/HDVLD-TEST-WEB01/virtualbox/private_key'

[webservers]
HDVLD-TEST-WEB01
HDVLD-TEST-WEB02

[loadbalancer]
HDVLD-TEST-LB01
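For reference, HAProxy backend entries normally start with the server keyword, which the templated loop in haproxy.conf.j2 above omits. A minimal sketch of how that loop could look, assuming eth1 facts really have been gathered for every host in the webservers group:

backend app
    balance     roundrobin
    {% for host in groups['webservers'] %}
    server {{ host }} {{ hostvars[host]['ansible_eth1']['ipv4']['address'] }}:{{ http_port }} check
    {% endfor %}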

Getting this error when deleting the comment: Can't perform a React state update on an unmounted component

Posted: 10 Jun 2022 07:48 AM PDT

I'm getting this error message when I try to delete a comment.

index.js:1 Warning: Can't perform a React state update on an unmounted component. This is a no-op, but it indicates a memory leak in your application. To fix, cancel all subscriptions and asynchronous tasks in a useEffect cleanup function.
    in Comment (at SolutionComments.js:23)

I am not using the useEffect hook directly in this component, yet it still says "cancel all subscriptions and asynchronous tasks in a useEffect cleanup function". Can anyone please help me with this error? (One common cleanup pattern is sketched after the hooks below.)

SolutionComments.js

import React, { useState } from "react"
import { useParams } from "react-router-dom"

import { useCollection } from "../../hooks/useCollection"

import Comment from "./Comment"
import CommentForm from "./CommentForm"

const SolutionComments = () => {
  const [activeComment, setActiveComment] = useState(null)
  const { id } = useParams()
  const { documents } = useCollection(`solutions/${id}/comments`, null, 4, [
    "createdAt",
    "desc",
  ])

  return (
    <div className="mt-10">
      <CommentForm docID={id} />
      <div>
        {documents &&
          documents.map((comment) => (
            <Comment // Line number 23
              key={comment.id}
              comment={comment}
              replies={comment.replies}
              activeComment={activeComment}
              setActiveComment={setActiveComment}
            />
          ))}
      </div>
    </div>
  )
}

export default SolutionComments

Comment.js

import React from "react"
import moment from "moment"
import { useParams } from "react-router-dom"

import { useAuthContext } from "../../hooks/useAuthContext"
import { useFirestore } from "../../hooks/useFirestore"

import CommentReply from "./CommentReply"
import ReplyForm from "./ReplyForm"

const Comment = ({
  comment,
  replies,
  activeComment,
  setActiveComment,
  parentId = null,
}) => {
  const { deleteSubCollectionDocument } = useFirestore("solutions")
  const { id: docID } = useParams()
  const { user } = useAuthContext()

  const isEditing =
    activeComment && activeComment.id === comment.id && activeComment.type === "editing"
  const isReplying =
    activeComment && activeComment.id === comment.id && activeComment.type === "replying"
  const replyId = parentId || comment.id

  // handle sub collection document
  const handleDelete = async () => {
    if (window.confirm("Do you really want to delete this comment?")) {
      await deleteSubCollectionDocument(docID, comment.id)
    }
  }

  return (
    <div className="my-4 border border-gray-800 rounded p-4">
      <div className="flex">
        <a
          href={`https://github.com/${comment.user.username}`}
          target="_blank"
          rel="noopener noreferrer"
        >
          <img
            className="w-12 rounded-full border-2 border-gray-800"
            src={comment.user.avatarURL}
            alt="avatar"
          />
        </a>
        <div className="ml-4 flex-1">
          <p className="text-gray-300 mb-2">
            <a
              href={`https://github.com/${comment.user.username}`}
              target="_blank"
              rel="noopener noreferrer"
            >
              {comment.user.displayName
                ? comment.user.displayName
                : comment.user.username}
            </a>
            <small className="pl-2 text-gray-400">
              {moment(comment.createdAt.toDate()).fromNow()}
            </small>
          </p>
          <div className="mt-2 flex">
            {user && (
              <button
                onClick={() => setActiveComment({ id: comment.id, type: "replying" })}
                className="text-gray-400"
              >
                <i className="fas fa-reply"></i>
                <small className="pl-2 font-semibold">Reply</small>
              </button>
            )}
            {user?.uid === comment.user.userID && (
              <>
                <button className="text-gray-400" onClick={handleDelete}>
                  <i className="fas fa-trash-alt"></i>
                  <small className="pl-2 font-semibold">Delete</small>
                </button>
              </>
            )}
          </div>
          {isReplying && (
            <ReplyForm
              docID={docID}
              replyingTo={comment.user.username}
              id={replyId}
              replies={replies}
              hasCancelButton
              setActiveComment={setActiveComment}
            />
          )}
          {replies &&
            replies
              .sort((a, b) => a.createdAt.seconds - b.createdAt.seconds)
              .map((reply) => (
                <CommentReply
                  key={reply.id}
                  comment={reply}
                  parentReplies={replies}
                  parentId={comment.id}
                  activeComment={activeComment}
                  setActiveComment={setActiveComment}
                />
              ))}
        </div>
      </div>
    </div>
  )
}

export default Comment

useFirestore hook

import { useEffect, useReducer, useState } from "react"
import {
  addDoc,
  collection,
  deleteDoc,
  doc,
  serverTimestamp,
  updateDoc,
} from "firebase/firestore"

import { db } from "../firebase/config"

const initialState = {
  document: null,
  isPending: false,
  error: null,
  success: null,
}

const firestoreReducer = (state, action) => {
  switch (action.type) {
    case "IS_PENDING":
      return { isPending: true, document: null, success: false, error: null }
    case "ADDED_DOCUMENT":
      return { isPending: false, document: action.payload, success: true, error: null }
    case "DELETED_DOCUMENT":
      return { isPending: false, document: null, success: true, error: null }
    case "UPDATED_DOCUMENT":
      return { isPending: false, document: action.payload, success: true, error: null }
    case "ERROR":
      return { isPending: false, document: null, success: false, error: action.payload }
    default:
      return state
  }
}

export const useFirestore = (c) => {
  const [response, dispatch] = useReducer(firestoreReducer, initialState)
  const [isCancelled, setIsCancelled] = useState(false)

  // only dispatch if not cancelled
  const dispatchIfNotCancelled = (action) => {
    if (!isCancelled) {
      dispatch(action)
    }
  }

  // add a document
  const addDocument = async (doc) => {
    dispatch({ type: "IS_PENDING" })

    try {
      const createdAt = serverTimestamp()
      const addedDocument = await addDoc(collection(db, c), {
        ...doc,
        createdAt,
      })
      dispatchIfNotCancelled({ type: "ADDED_DOCUMENT", payload: addedDocument })
    } catch (error) {
      dispatchIfNotCancelled({ type: "ERROR", payload: error.message })
    }
  }

  const updateSubCollectionDocument = async (docID, id, updates) => {
    dispatch({ type: "IS_PENDING" })
    try {
      const updatedDocument = await updateDoc(doc(db, c, docID, "comments", id), updates)
      dispatchIfNotCancelled({ type: "UPDATED_DOCUMENT", payload: updatedDocument })
      return updatedDocument
    } catch (error) {
      console.log(error)
      dispatchIfNotCancelled({ type: "ERROR", payload: error })
      return null
    }
  }

  const deleteSubCollectionDocument = async (docID, id) => {
    dispatch({ type: "IS_PENDING" })

    try {
      await deleteDoc(doc(db, c, docID, "comments", id))
      dispatchIfNotCancelled({ type: "DELETED_DOCUMENT" })
    } catch (error) {
      dispatchIfNotCancelled({ type: "ERROR", payload: error })
    }
  }

  useEffect(() => {
    return () => setIsCancelled(true)
  }, [isCancelled])

  return {
    addDocument,
    updateSubCollectionDocument,
    deleteSubCollectionDocument,
    response,
  }
}

useCollection hook:

import { useEffect, useRef, useState } from "react"
// firebase import
import { collection, limit, onSnapshot, orderBy, query, where } from "firebase/firestore"

import { db } from "../firebase/config"

export const useCollection = (c, _q, _l, _o) => {
  const [documents, setDocuments] = useState([])
  const [isLoading, setIsLoading] = useState(true)
  const [error, setError] = useState(null)

  // if we don't use a ref --> infinite loop in useEffect
  // _query is an array and is "different" on every function call
  const q = useRef(_q).current
  const o = useRef(_o).current

  useEffect(() => {
    let ref = collection(db, c)

    if (q) {
      ref = query(ref, where(...q))
    }
    if (o) {
      ref = query(ref, orderBy(...o))
    }
    if (_l) {
      ref = query(ref, limit(_l))
    }

    const unsubscribe = onSnapshot(ref, (snapshot) => {
      const results = []
      snapshot.docs.forEach(
        (doc) => {
          results.push({ ...doc.data(), id: doc.id })
        },
        (error) => {
          console.log(error)
          setError("could not fetch the data")
        }
      )
      // update state
      setDocuments(results)
      setIsLoading(false)
      setError(null)
    })

    // unsubscribe on unmount
    return () => unsubscribe()
  }, [c, q, _l, o, isLoading])

  return { documents, error, isLoading }
}
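Not part of the original question, but for context: this warning usually means an async callback ran dispatch/setState after the component unmounted (here, deleting a comment unmounts Comment while handleDelete is still awaiting). A minimal sketch of the usual ref-based guard; useIsMounted is an assumed name, not from the code above:

import { useEffect, useRef } from "react"

// Track mount status in a ref; the effect's cleanup flips it on unmount.
export const useIsMounted = () => {
  const isMounted = useRef(true)
  useEffect(() => {
    return () => {
      isMounted.current = false
    }
  }, [])
  return isMounted
}

// usage inside an async handler: only touch state while still mounted
// const isMounted = useIsMounted()
// await deleteSubCollectionDocument(docID, comment.id)
// if (isMounted.current) dispatch({ type: "DELETED_DOCUMENT" })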

Function input based locking in multithreading

Posted: 10 Jun 2022 07:48 AM PDT

I'm working with a piece of code that processes messages from a queue (using MassTransit). Many messages can be processed in parallel. All messages create or modify an object in Active Directory (in this case), and every object needs to be validated against the AD schema definitions. (Though it's not relevant to the problem, I want to note that we have many customers with custom extensions in their AD schema.)

Retrieving the schema information is a slow operation. I want to do it once and then cache it. But with many messages processing in parallel, many of them start retrieving the schema information before the first one succeeds, so too much work is done. For the moment I fixed this with a simple semaphore; see the code below.

But that is not a good solution, as now only one thread at a time can ever enter this code.

I need something that locks the code once per object class and holds off other requests until the first retrieval and caching is complete.

What kind of construct will allow me to do that?

private static SemaphoreSlim _lock = new SemaphoreSlim(1, 1);

public ActiveDirectorySchemaObject? GetSchemaObjectFor(string objectClass)
{
    //todo: create better solution
    _lock.Wait();
    try
    {
        if (_activeDirectorySchemaContainer.HasSchemaObjectFor(
            _scopeContext.CustomerId, objectClass) == false)
        {
            _logger.LogInformation($"Getting and caching schema from AD " +
                $"for {objectClass}");
            _activeDirectorySchemaContainer.SetSchemaObjectFor(
                _scopeContext.CustomerId, objectClass,
                GetSchemaFromActiveDirectory(objectClass));
        }
    }
    finally
    {
        _lock.Release();
    }

    return _activeDirectorySchemaContainer.GetSchemaObjectFor(
        _scopeContext.CustomerId, objectClass);
}

To simplify the question: in short, I am looking for the proper construct to lock a piece of code against parallel access once per variation of an input.

A comment mentioned Lazy, something I have not used before. Reading the docs, I see it defers initialization of an object until later, so maybe I could refactor for that. But looking at the code as it currently is, I seem to need a lazy "if" or a lazy "function"; maybe I am overcomplicating things. I find that thinking about parallel programming often hurts my head. (A sketch of the Lazy idea follows below.)
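For what it's worth, the construct usually reached for here is a ConcurrentDictionary of Lazy values: GetOrAdd guarantees a single Lazy per key, and LazyThreadSafetyMode.ExecutionAndPublication guarantees the factory runs only once, while callers for other keys never block. A minimal sketch; SchemaCache, GetOrLoad, and fetch are placeholder names, not from the code above:

using System;
using System.Collections.Concurrent;
using System.Threading;

public sealed class SchemaCache
{
    // One Lazy per key; GetOrAdd hands every concurrent caller the same instance.
    private readonly ConcurrentDictionary<string, Lazy<object>> _cache = new();

    public object GetOrLoad(string key, Func<string, object> fetch)
    {
        var lazy = _cache.GetOrAdd(
            key,
            k => new Lazy<object>(
                () => fetch(k), // runs at most once per key
                LazyThreadSafetyMode.ExecutionAndPublication));

        // The first caller for this key pays the retrieval cost; concurrent
        // callers for the same key block here until the value exists.
        // Callers for other keys are unaffected.
        return lazy.Value;
    }
}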

Pulling 90,000 emails into a Google Sheet

Posted: 10 Jun 2022 07:49 AM PDT

I have used https://webapps.stackexchange.com/questions/160232/gmail-bulk-emails-manipulation and modified it a bit to copy some 90,000 emails (from a free Gmail account) to a sheet.

The script works only once: after one cycle, the trigger becomes "disabled" with "Unknown reason".

I reduced the batch size and increased the time gap, but it still fails the same way.

Where am I going wrong?

/**
 * Creates the first trigger to call batchArchiveEmail.
 */
function init(){
  var Triggers = ScriptApp.getProjectTriggers();
  for (var p = 0; p < Triggers.length; p++) {
    ScriptApp.deleteTrigger(Triggers[p])
  }

  ScriptApp
    .newTrigger('batchArchiveEmail')
    .timeBased()
    .after(60 * 1000)
    .create();
  console.log(`trigger created`)
}

/**
 * Archive emails by batches, limiting the execution time and creating a
 * trigger if there are still threads pending to be archived.
 */
function batchArchiveEmail(){
  const start = Date.now();
  /**
   * Own execution time limit for the search and archiving operations to prevent an
   * uncatchable error. As the execution time check is done in the do..while condition,
   * there should be enough time for one search-and-archive operation and for creating
   * a trigger to start a new execution.
   */
  const maxTime = 3 * 60 * 1000; // Instead of 25 use 3 for Google free accounts
  const batchSize = 50;
  let threads, elapsedTime;
  var ss = SpreadsheetApp.getActive();
  var sheet = ss.getSheetByName("Sheet1");
  /** Search and archive threads, then repeat until the search returns 0 threads
   * or the maxTime is reached */
  var ms = [];
  do {
    threads = GmailApp.search('label:inbox before:2022/5/1');
    for (let j = 0; j < threads.length; j += batchSize) {
      // below code added by me
      ms = [];
      var messages = threads[j].getMessages();
      for (var m = 0; m < messages.length; m++) {
        var from = messages[m].getFrom(); // from field
        var mId = messages[m].getId(); // id field to create the link later
        var supportStats = [];
        var to = messages[m].getTo(); // to field
        var time = messages[m].getDate(); // date field
        var subject = messages[m].getSubject(); // subject field
        var body = messages[m].getPlainBody();
        var tel = body.match(/[\+]?\d{10}|\(\d{3}\)\s?-\d{6}|\d{3}\s-\d{3}\s\d{4}/);
        supportStats.push(from);
        supportStats.push(to);
        supportStats.push(time);
        supportStats.push(subject);
        supportStats.push('https://mail.google.com/mail/u/0/#inbox/' + mId); // build the URL to the email
        supportStats.push(body);
        if (tel) { supportStats.push(tel[0]) } else { supportStats.push("") }
        ms.push(supportStats);
      }
      var lr = sheet.getLastRow();
      sheet.getRange(lr + 1, 1, ms.length, 7).setValues(ms);
      // above code added by me
      GmailApp.moveThreadsToArchive(threads.slice(j, j + batchSize));
    }
    /**
     * Used to prevent having too many calls in a short time; might not be
     * necessary with a large enough batchSize
     */
    Utilities.sleep(2000);
    elapsedTime = Date.now() - start;
  } while (threads.length > 0 && elapsedTime < maxTime);
  if (threads.length > 0) {
    /** Delete the last trigger */
    var Triggers = ScriptApp.getProjectTriggers();
    for (var p = 0; p < Triggers.length; p++) {
      ScriptApp.deleteTrigger(Triggers[p])
    }
    // deleteTriggers();

    /** Create a one-time new trigger */
    ScriptApp
      .newTrigger('batchArchiveEmail')
      .timeBased()
      .after(300 * 1000)
      .create();
    console.log(`next trigger created`)
  } else {
    /** Delete the last trigger */
    var Triggers = ScriptApp.getProjectTriggers();
    for (var p = 0; p < Triggers.length; p++) {
      ScriptApp.deleteTrigger(Triggers[p])
    }
    console.log(`No more threads to process`);
  }
}
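Not part of the original script, but one detail that may matter at this scale: GmailApp.search without a range re-fetches every matching thread on each cycle, and long-running executions are a common reason time-driven triggers end up disabled. A sketch using the paged form of search (same query string as above; the batch size is an assumption):

// Fetch at most one batch of threads per call instead of all matches.
// search(query, start, max) is the paged variant of GmailApp.search.
function fetchNextBatch(batchSize) {
  // Always read from offset 0: archiving removes processed threads from
  // the label:inbox result set, so the next call sees fresh threads.
  return GmailApp.search('label:inbox before:2022/5/1', 0, batchSize);
}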

Windbg javascript engine: Casting from a byte array to a type

Posted: 10 Jun 2022 07:49 AM PDT

So I have the following data structure:

struct A_TYPE {
    UINT64 a;
    UINT32 b;
    UINT32 c[16];
};

and I have a byte array filled in by code similar to the following:

var buf = new ArrayBuffer(128);
var numbers = new Uint8Array(buf);
// code that sets the numbers array

Now I want to create an A_TYPE object based on the numbers byte array.

I have tried the following, but nothing worked:

// attempt 1:
var instantiatedType = host.typeSystem.marshalAs(numbers, srcName, typeName);

// attempt 2:
var type = host.getModuleType(srcName, typeName);
var instantiatedType = type.initialize(numbers); // initialize/constructor was a function I hoped to exist

Any ideas whether this functionality is already implemented? It sounds like a basic feature, but I couldn't find it.
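Not from the question, but for contrast: when the bytes already live in target memory at a known address, the documented way to overlay a type is host.createTypedObject. A sketch, assuming an address into the target's address space and that A_TYPE is defined in the symbols of a module here called "mymodule" (a placeholder name); it does not cover script-local ArrayBuffers:

// Overlay A_TYPE onto target memory at `address`.
function asAType(address)
{
    return host.createTypedObject(address, "mymodule", "A_TYPE");
}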

EPIC FHIR SMART Backend Services: { "error": "invalid_client", "error_description": null }

Posted: 10 Jun 2022 07:48 AM PDT

I'm trying to implement the EPIC FHIR SMART Backend Services (backend OAuth 2.0) flow in Go.

I've created my dev account, uploaded the public key there, and selected the backend system as the application audience.

I'm pretty sure my JWT is correct: I've inspected it on jwt.io and the signature is valid. However, I always get this error:

{ "error": "invalid_client", "error_description": null }

I've tried other possible solutions as well such as:

  • ensuring the expiration date within the JWT claims is less than 5 minutes away
  • placing the payload in the body with the correct content type, which is application/x-www-form-urlencoded
  • ensuring I use the sandbox client_id
  • using the correct JWT signing method (RS384)

What should I do to resolve this issue?

By the way, I also saw several discussions on Google Groups saying that it's worth waiting one or two days after the dev account is created.

Below is my code. Appreciate the help!

var (
    oauth2TokenUrl  = "https://fhir.epic.com/interconnect-fhir-oauth/oauth2/token"
    sandboxClientID = "..."
    privateKey      = "..."
)

// load private key
signKey, err := jwt.ParseRSAPrivateKeyFromPEM([]byte(privateKey))
So(err, ShouldBeNil)

// construct jwt claims
now := time.Now()
claims := jwt.MapClaims{
    "iss": sandboxClientID,
    "sub": sandboxClientID,
    "aud": oauth2TokenUrl,
    "jti": uuid.New().String(),             // fill with reference id
    "exp": now.Add(1 * time.Minute).Unix(), // cannot be more than 5 minutes!
}
log.Info("  => claims:", utility.ToJsonString(claims))

// generate signed token using private key with RS384 algorithm
alg := jwt.SigningMethodRS384
signedToken, err := jwt.NewWithClaims(alg, claims).SignedString(signKey)
So(err, ShouldBeNil)
log.Info("  => signed token", signedToken)

// prepare api call payload
payload := map[string]string{
    "grant_type":            "client_credentials",
    "client_assertion_type": "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
    "client_assertion":      signedToken,
}

// dispatch the api call
req := resty.New().
    R().
    EnableTrace().
    SetFormData(payload)
res, err := req.Post(oauth2TokenUrl)
So(err, ShouldBeNil)
log.Info("  => response status:", res.StatusCode())
log.Info("  => response header:", res.Header())
log.Info("  => response body:", string(res.Body()))

// parse response
resBody := make(map[string]interface{})
err = json.Unmarshal(res.Body(), &resBody)
So(err, ShouldBeNil)
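Not from the question, but one more thing that may be worth ruling out when debugging invalid_client: if the registered key is matched by key ID, the assertion needs a kid header, which the snippet above never sets. A hedged sketch ("my-key-id" is a placeholder, not a value from the question):

// Attach an explicit key ID header so the server can match the registered key.
token := jwt.NewWithClaims(jwt.SigningMethodRS384, claims)
token.Header["kid"] = "my-key-id" // placeholder; must match the registered key's ID
signedToken, err := token.SignedString(signKey)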

org.apache.spark.SparkException: Job aborted due to stage failure in databricks

Posted: 10 Jun 2022 07:48 AM PDT

Sorry for asking the same type of question; I saw many posts on SO about stage failures, but none of them resolved my issue, so I'm posting it again. I'm running on Databricks Runtime 7.3 LTS. I have a Spark dataframe df2. When I run the command

df2.show()  

I get the following error message. Can you help me resolve the issue?

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 231.0 failed 4 times,
most recent failure: Lost task 0.3 in stage 231.0 (TID 6106, 10.52.98.16, executor 0):
com.databricks.sql.io.FileReadException: Error while reading file
dbfs:/user/hive/warehouse/p_suggestedpricefornegotiation/part-00000-e64f3491-8afe-44a9-a55d-3495bc7a1395-c000.snappy.parquet.
A file referenced in the transaction log cannot be found. This occurs when data has been manually deleted
from the file system rather than using the table `DELETE` statement. For more information, see
https://docs.microsoft.com/azure/databricks/delta/delta-intro#frequently-asked-questions

Py4JJavaError                             Traceback (most recent call last)
<command-780007467828035> in <module>
----> 1 df2.show()

/databricks/spark/python/pyspark/sql/dataframe.py in show(self, n, truncate, vertical)
    382         """
    383         if isinstance(truncate, bool) and truncate:
--> 384             print(self._jdf.showString(n, 20, vertical))
    385         else:
    386             print(self._jdf.showString(n, int(truncate), vertical))

/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1255         answer = self.gateway_client.send_command(command)
   1256         return_value = get_return_value(
-> 1257             answer, self.gateway_client, self.target_id, self.name)
   1258
   1259         for temp_arg in temp_args:

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
     61     def deco(*a, **kw):
     62         try:
---> 63             return f(*a, **kw)
     64         except py4j.protocol.Py4JJavaError as e:
     65             s = e.java_exception.toString()

/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
--> 328                     format(target_id, ".", name), value)
    329             else:
    330                 raise Py4JError(

Py4JJavaError: An error occurred while calling o804.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 231.0 failed 4 times, most recent failure: Lost task 0.3 in stage 231.0 (TID 6106, 10.52.98.16, executor 0): com.databricks.sql.io.FileReadException: Error while reading file dbfs:/user/hive/warehouse/p_suggestedpricefornegotiation/part-00000-e64f3491-8afe-44a9-a55d-3495bc7a1395-c000.snappy.parquet. A file referenced in the transaction log cannot be found. This occurs when data has been manually deleted from the file system rather than using the table `DELETE` statement. For more information, see https://docs.microsoft.com/azure/databricks/delta/delta-intro#frequently-asked-questions
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anon$2.logFileNameAndThrow(FileScanRDD.scala:331)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anon$2.getNext(FileScanRDD.scala:297)
  at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anonfun$prepareNextFile$1.apply(FileScanRDD.scala:463)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anonfun$prepareNextFile$1.apply(FileScanRDD.scala:451)
  at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
  at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable$$anonfun$run$1.apply$mcV$sp(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable$$anonfun$run$1.apply(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable$$anonfun$run$1.apply(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper$class.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:68)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:101)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.run(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.FileNotFoundException: dbfs:/user/hive/warehouse/p_suggestedpricefornegotiation/part-00000-e64f3491-8afe-44a9-a55d-3495bc7a1395-c000.snappy.parquet
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$getFileStatus$1$$anonfun$apply$15.apply(DatabricksFileSystemV2.scala:770)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$getFileStatus$1$$anonfun$apply$15.apply(DatabricksFileSystemV2.scala:756)
  at com.databricks.s3a.S3AExeceptionUtils$.convertAWSExceptionToJavaIOException(DatabricksStreamUtils.scala:108)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$getFileStatus$1.apply(DatabricksFileSystemV2.scala:756)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$getFileStatus$1.apply(DatabricksFileSystemV2.scala:756)
  at com.databricks.logging.UsageLogging$$anonfun$recordOperation$1.apply(UsageLogging.scala:428)
  at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
  at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionContext(DatabricksFileSystemV2.scala:450)
  at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionTags(DatabricksFileSystemV2.scala:450)
  at com.databricks.logging.UsageLogging$class.recordOperation(UsageLogging.scala:409)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperation(DatabricksFileSystemV2.scala:450)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.getFileStatus(DatabricksFileSystemV2.scala:755)
  at com.databricks.backend.daemon.data.client.DatabricksFileSystem.getFileStatus(DatabricksFileSystem.scala:201)
  at com.databricks.spark.metrics.FileSystemWithMetrics.getFileStatus(FileSystemWithMetrics.scala:295)
  at org.apache.parquet.hadoop.util.HadoopInputFile.fromPath(HadoopInputFile.java:39)
  at org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:452)
  at com.databricks.sql.io.parquet.CachingParquetFileReader.readFooter(CachingParquetFileReader.java:366)
  at org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.prepare(SpecificParquetRecordReaderBase.java:128)
  at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anonfun$buildReaderWithPartitionValues$1.apply(ParquetFileFormat.scala:477)
  at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anonfun$buildReaderWithPartitionValues$1.apply(ParquetFileFormat.scala:390)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anon$2.getNext(FileScanRDD.scala:281)
  ... 14 more

Driver stacktrace:
  at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:2362)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:2350)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:2349)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2349)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:1102)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:1102)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1102)
  at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2582)
  at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2529)
  at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2517)
  at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
  at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:897)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2280)
  at org.apache.spark.sql.execution.collect.Collector.runSparkJobs(Collector.scala:270)
  at org.apache.spark.sql.execution.collect.Collector.collect(Collector.scala:280)
  at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:80)
  at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:86)
  at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:508)
  at org.apache.spark.sql.execution.CollectLimitExec.executeCollectResult(limit.scala:57)
  at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectResult(Dataset.scala:2905)
  at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3517)
  at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2634)
  at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2634)
  at org.apache.spark.sql.Dataset$$anonfun$54.apply(Dataset.scala:3501)
  at org.apache.spark.sql.Dataset$$anonfun$54.apply(Dataset.scala:3496)
  at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1$$anonfun$apply$1.apply(SQLExecution.scala:112)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:232)
  at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1.apply(SQLExecution.scala:98)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:835)
  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:74)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:184)
  at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withAction(Dataset.scala:3496)
  at org.apache.spark.sql.Dataset.head(Dataset.scala:2634)
  at org.apache.spark.sql.Dataset.take(Dataset.scala:2848)
  at org.apache.spark.sql.Dataset.getRows(Dataset.scala:279)
  at org.apache.spark.sql.Dataset.showString(Dataset.scala:316)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
  at py4j.Gateway.invoke(Gateway.java:295)
  at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
  at py4j.commands.CallCommand.execute(CallCommand.java:79)
  at py4j.GatewayConnection.run(GatewayConnection.java:251)
  at java.lang.Thread.run(Thread.java:748)
Caused by: com.databricks.sql.io.FileReadException: Error while reading file dbfs:/user/hive/warehouse/p_suggestedpricefornegotiation/part-00000-e64f3491-8afe-44a9-a55d-3495bc7a1395-c000.snappy.parquet. A file referenced in the transaction log cannot be found. This occurs when data has been manually deleted from the file system rather than using the table `DELETE` statement. For more information, see https://docs.microsoft.com/azure/databricks/delta/delta-intro#frequently-asked-questions
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anon$2.logFileNameAndThrow(FileScanRDD.scala:331)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anon$2.getNext(FileScanRDD.scala:297)
  at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anonfun$prepareNextFile$1.apply(FileScanRDD.scala:463)
  at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1$$anonfun$prepareNextFile$1.apply(FileScanRDD.scala:451)
  at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
  at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable$$anonfun$run$1.apply$mcV$sp(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable$$anonfun$run$1.apply(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable$$anonfun$run$1.apply(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper$class.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:68)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:101)
  at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.run(SparkThreadLocalForwardingThreadPoolExecutor.scala:104)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  ... 1 more
Caused by: java.io.FileNotFoundException: dbfs:/user/hive/warehouse/p_suggestedpricefornegotiation/part-00000-e64f3491-8afe-44a9-a55d-3495bc7a1395-c000.snappy.parquet
  ... (same getFileStatus/readFooter frames as in the first Caused by above)
  ... 14 more
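Not an answer from the thread, but the FAQ linked in the error describes exactly this state: files were removed from storage without going through Delta, so the transaction log still references them. A hedged sketch of the usual remediation, assuming the table is a Delta table and the missing files really are gone:

# Preview, then remove, transaction-log entries for files that no longer exist.
spark.sql("FSCK REPAIR TABLE p_suggestedpricefornegotiation DRY RUN")
spark.sql("FSCK REPAIR TABLE p_suggestedpricefornegotiation")

# If the files do still exist, refreshing stale cached listings can be enough:
spark.sql("REFRESH TABLE p_suggestedpricefornegotiation")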

Selenium click event does not trigger angularjs event

Posted: 10 Jun 2022 07:48 AM PDT

I have a page with an AngularJS modal-content popup. In it I fill in some fields and click Save. After the save is initiated, the popup should disappear, an event should fire, and so on.

My Selenium test does all that perfectly, except that when it clicks the save button, the popup disappears but no event is triggered and nothing is saved, so when I open the window again everything is empty. I've tried everything I know with Selenium and it still doesn't work. Can anyone help me out here?

This is the save button:

<button class="save-button" data-ng-click="onSettingsSave()" ng-hide="readOnlyMode || !canSave()">Save</button>  

Stuff I've tried (one more wait-based variant is sketched after these snippets):

var saveButton = driver.FindElement(By.CssSelector("button.save-button"));
saveButton.Click();

var saveButton = driver.FindElement(By.XPath(saveXpath));
saveButton.SendKeys(Keys.Enter);

((IJavaScriptExecutor)driver).ExecuteScript("arguments[0].focus();", saveButton);
((IJavaScriptExecutor)driver).ExecuteScript("arguments[0].click();", saveButton);
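Not part of the original question, but one more variant that may be worth a try: wait until the button is displayed and enabled before clicking, so the click cannot land before AngularJS has attached the ng-click handler. A sketch, assuming the standard Selenium .NET support library:

using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;
using System;

// Poll until the save button is visible and enabled, then click it.
var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
var saveButton = wait.Until(d =>
{
    var el = d.FindElement(By.CssSelector("button.save-button"));
    return (el.Displayed && el.Enabled) ? el : null;
});
saveButton.Click();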

Javascript Error undefined is not a function

Posted: 10 Jun 2022 07:48 AM PDT

var centuryGroup = grouping(ancestry, function(person){
  return Math.ceil(person.died/100);
});

function grouping(array, group){
  var groups = {};
  array.forEach(function(element) {
    var groupName = group(element);
    if(groupName in groups){
      groups[groupName].push(element); // the error is stated at this line
    }
    else{
      groups[groupName] = element;
    }
  });
  return groups;
}

As stated above, the error is at

groups[groupName].push(element);  

If I remove this, it works.

And if it's defined as:

function grouping(array, group) {
  var groups = {};
  array.forEach(function(element) {
    var groupName = group(element);
    if (groupName in groups)
      groups[groupName].push(element);
    else
      groups[groupName] = [element];
  });
  return groups;
}

var centuryGroup = grouping(ancestry, function(person) {
  return Math.ceil(person.died / 100);
});

There is no error in the second one. (The second snippet is taken from a tutorial.)

Things I have tried:

  1. I moved the "var centuryGroup = ..." call below the grouping function, and it made no difference.
  2. I made sure the code works when I comment out the error line.
  3. I removed the {} from the if and else statements and the error is still there (not that I expected that to make a difference).

The exact error is:

TypeError undefined is not a function

The contents of the "array" that is passed in are:

{ name: "Carolus Haverbeke" }
  sex:    "m"
  born:   1832
  died:   1905
  father: "Carel Haverbeke"
  mother: "Maria van Brussel"
{ name: "Emma de Milliano" }
  sex:    "f"
  born:   1876
  died:   1956
  father: "Petrus de Milliano"
  mother: "Sophia van Damme"
...

By the way, if it's not supposed to have the "push" property, the second snippet shouldn't work either. But it does, and it shows the desired result. I'm confused.
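For reference, a minimal demonstration of the difference between the two snippets: push exists only on arrays, so what matters is whether the first value stored under a group name is wrapped in an array.

var groups = {};

groups.a = { name: "x" };        // first snippet: stores the bare object
// groups.a.push({ name: "y" }); // TypeError: groups.a.push is not a function

groups.b = [{ name: "x" }];      // second snippet: stores a one-element array
groups.b.push({ name: "y" });    // works: push is an Array method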
