Monday, August 9, 2021

Recent Questions - Stack Overflow



Unable to import required dependencies: numpy

Posted: 09 Aug 2021 08:04 AM PDT

I'm trying to build a visual dashboard in VS Code using dash. But I can't get past this error I'm getting:

Original error was: DLL load failed: The specified module could not be found.
File "C:\Users\ko14Z57\plots.py", line 1, in <module>
    import pandas as pd

Exception has occurred: ImportError Unable to import required dependencies: numpy:

I'm using Python 3.6.13, but I've also tried this with Python 3.9 and 3.8. I also tried uninstalling and reinstalling numpy and pandas. I'm using Anaconda for my package management. Any help would be really appreciated!
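A common cause of this DLL error in VS Code is the editor running a different interpreter than the one where numpy was installed. As a diagnostic sketch (not a guaranteed fix), printing which Python and which numpy are actually in use can rule that out:

```python
import sys

# Which interpreter is VS Code actually running?
print(sys.executable)

try:
    import numpy
    print("numpy", numpy.__version__, "loaded from", numpy.__file__)
except ImportError as exc:
    # If this fails, the interpreter selected in VS Code does not
    # have a working numpy build in its environment.
    print("numpy failed to import:", exc)
```

If the printed path is not the Anaconda environment you installed into, switching the VS Code interpreter to that environment is the first thing to try.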

VS Code code navigation doesn’t work after adding compilerOptions

Posted: 09 Aug 2021 08:04 AM PDT

In VS Code, you can navigate to the file where the original code is by Ctrl + Click.

However, it doesn't work properly after I added compilerOptions in jsconfig.json in my project.

{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/components/*": ["components/*"],
      "@/styles/*": ["styles/*"],
      ...

Does anyone have similar issues? Should I install a VS Code extension to solve this?

Mongo realm function. $push on array and $set on others

Posted: 09 Aug 2021 08:04 AM PDT

I need to update a document which has nested arrays (books -> images), with data like below. I want to update fields such as books -> favorite and also add/delete images in books -> images.

userid: "ABC123"
books: Array
  0: Object
    BookId: "082392519186"
    favorite: false
    images: Array
      0: Object
        ImgId: "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA/wAAAI+CAMAAAA2KPHCAAAA..."
      1: Object
        "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA/wAAUAl//lHTWwDyK0ECT/KUP/////..."

Below is my code; please see my comments on the lines where I describe the issues I am facing.

var oInputMessages = {
    ImgId: arg.ImgId
},
filter2 = {
    'userid': arg.id,
    'books.BookId': arg.BookId
},
updatevar = {
    $set: {
        'books.$': {
            'BookId': arg.BookId,
            'favorite': arg.myFav,
            //$push: {images: [oInputMessages]} //Error: The dollar ($) prefixed field '$push' in 'books.1.$push' is not valid for storage
            // 'images': [oInputMessages]       //Always overwriting the entry, so only one entry remains
        }
    },
    //$push: {'books.$': {images: oInputMessages}} //Error: Updating the path 'books.$' would create a conflict at 'books.$'
    $push: {images: oInputMessages}              //Adds an image each time just fine BUT places the "images" array outside of the "books" array. I want books[].images[].ImgId
},
options = {upsert: true};

var doc = user_books_collection.findOneAndUpdate(filter2, updatevar, options, function (err, data) {
    if (err) {
        return res.status(500).send(err);
    }
    if (!data) {
        return res.status(404).end();
    }
    return res.status(200).send(data);

Alternatively, I tried the following:

I have the two functions below (one that does $set and one that does $push on the same collection, user_books) and they work fine standalone. But when I wrap them both in another function and call them one after the other, for some reason it always overwrites the image with the latest one and does NOT ADD the image.

var updateRatingsAndComments = context.functions.execute("fnUpdateFavoriteBook", arg);  // This function does a $set and works fine on its own
var uploadUserBookImage = context.functions.execute("fnUploadUserBookImage", arg);      // This function uploads the image using $push. When I call it individually it ADDS the image each time, so it is fine on its own.

But when I wrap both functions like below (because I want to do both operations in one shot), it overwrites the previous image instead of adding a new one.

// fnUpdateUserRatingsAndComments
exports = async function(arg) {
    var updateFavoriteBook = context.functions.execute("fnUpdateFavoriteBook", arg);
    var uploadUserBookImage = context.functions.execute("fnUploadUserBookImage", arg);
};

How to retrieve values from a HashMap in which the key is an item of a ComboBox in Vaadin

Posted: 09 Aug 2021 08:04 AM PDT

I want to store and retrieve a key value associated with the ComboBox. I have only used getSelectedIndex() and getSelectedItem(). These do not serve my purpose, as I have to get a unique key value associated with the item.

Example Scenario :

India - 10, China - 15, Russia - 18. Here, if 'India' is the ComboBox item, then '10' is its key. Likewise, 15 for China and 18 for Russia.

When India is selected I need to get the value 10; if China, 15; if Russia, 18.

How can I achieve this in Vaadin? Could you guide me on how to do this?
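The mapping described here is a plain key-value lookup. As a language-agnostic illustration (in Python rather than the question's Java/Vaadin, and with invented names), the pattern is simply a map from the displayed item to its key:

```python
# Hypothetical mapping of combo-box items to their numeric keys.
country_codes = {"India": 10, "China": 15, "Russia": 18}

def code_for(selected_item):
    """Return the key associated with the selected combo-box item."""
    return country_codes[selected_item]

print(code_for("India"))  # 10
```

In Vaadin itself the usual design is to bind the ComboBox to objects that carry both the display name and the key (rather than bare strings), but the exact API depends on the Vaadin version in use.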

JSON Serialization in C# - Object in Object

Posted: 09 Aug 2021 08:04 AM PDT

Is it possible to display all elements from "UserStatus" directly in the JSON?

From: { "UserName": "test", "UserStatus": { "Id": "1", "Status": "status" }}

To: { "UserName": "test", "Id": "1", "Status": "status" }

public class UserStatus
{
    public int Id { get; set; }
    public string Status { get; set; }
}

public class User
{
    public string UserName { get; set; }
    public UserStatus Status { get; set; }
}

var user = new User();
string json = JsonConvert.SerializeObject(user);
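The desired transformation is just hoisting the nested object's fields up to the top level. A small Python sketch of that reshaping (the C# route would involve a custom converter or serializing an anonymous/flattened type, which is the actual question being asked):

```python
import json

nested = {"UserName": "test", "UserStatus": {"Id": "1", "Status": "status"}}

# Copy the top-level fields, then merge the nested object's fields in.
flat = {k: v for k, v in nested.items() if k != "UserStatus"}
flat.update(nested["UserStatus"])

print(json.dumps(flat))  # {"UserName": "test", "Id": "1", "Status": "status"}
```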

create-react-app Plug n Play Eslint Error

Posted: 09 Aug 2021 08:04 AM PDT

Problem

Compilation error when starting create-react-app in PnP mode with Yarn.

Failed to compile You cannot require a package ("eslint-config-react-app") that is not declared in your dependencies (via "/Users/nish7/Code/webdev/playground/day365/package.json") Referenced from: /Users/nish7/Code/webdev/playground/day365/package.json

Steps to reproduce

  1. yarn create react-app . --use-pnp
  2. yarn start

Expected Outcome

Build the application successfully

package.json

{
  "name": "day365",
  "version": "0.1.0",
  "private": true,
  "installConfig": {
    "pnp": true
  },
  "dependencies": {
    "@testing-library/jest-dom": "^5.11.4",
    "@testing-library/react": "^11.1.0",
    "@testing-library/user-event": "^12.1.10",
    "react": "^17.0.2",
    "react-dom": "^17.0.2",
    "react-scripts": "4.0.3",
    "web-vitals": "^1.0.1"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test",
    "eject": "react-scripts eject"
  },
  "eslintConfig": {
    "extends": [
      "react-app",
      "react-app/jest"
    ]
  },
  "browserslist": {
    "production": [
      ">0.2%",
      "not dead",
      "not op_mini all"
    ],
    "development": [
      "last 1 chrome version",
      "last 1 firefox version",
      "last 1 safari version"
    ]
  }
}

Note: node_modules is being created after yarn start (unexpected, considering PnP mode).

Env

  • yarn -v: 1.22.11
  • node -v: 16.6.1
  • npm -v: 7.20.5
  • eslint (VS Code extension): not installed

Legend not showing in R Plot

Posted: 09 Aug 2021 08:04 AM PDT

I created the following plot but my legend disappeared. It worked earlier but suddenly it won't show... I already tried restarting the session and rewriting the function, but that's not changing anything.

USDT_chart = plot(btc_daily_return, lwd = 0.7, col = "darkgrey", type = 'l',
                  ylab = "Returns", main = "Daily returns of Tether")
lines(usdt_daily_return, lty = 2, lwd = 1, col = "red")
lines(eurusd_daily_return, lty = 1, lwd = 0.8, col = "blue")
legend("bottomleft", inset = 0.05, legend = c("USDT", "BTC", "EUR/USD"),
       col = c("red", "darkgrey", "blue"), lty = 1)

Does anybody know what might cause this? Thank you!

Etcher says - it looks like this is not a bootable image; the image does not appear to contain a partition table

Posted: 09 Aug 2021 08:04 AM PDT

While creating a Windows ISO, Etcher says:

it looks like this is not a bootable image; the image does not appear to contain a partition table and might not be recognized or bootable by your device

CSS - justify-content

Posted: 09 Aug 2021 08:05 AM PDT

I'm trying to add a header from a separate file to the main site.

The justify-content property does not work on the main site.

Main site php:

<body>
    <!-- HEADER -->
    <header>
        <!-- adding header -->
        <?php
            include ('header.php');
        ?>
    </header>

Everything works except the justify-content property, even when I commented out all the code in main_site.css.

Header css:

* {
    box-sizing: border-box;
    margin: 0;
    padding: 0;
    background-color: #282C2F;
    font-family: 'Source Sans Pro', sans-serif;
    font-size: 16px;
    color: whitesmoke;
    text-decoration: none;
}

/* HEADER */
header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    padding: 30px 5%; /* top, bottom - 30px; right, left - 5% */
}

header in header file:

header in main site file

Using JS it looks very similar

<script
    src="https://code.jquery.com/jquery-3.3.1.js"
    integrity="sha256-2Kok7MbOyxpgUVvAk/HJ2jigOSYS2auK4Pfzbm7uH60="
    crossorigin="anonymous">
</script>

<script>
    $(function(){
        $("#header").load("header.php");
    });
</script>
</head>

<body>
    <!-- HEADER -->
    <header>
        <div id='header'>
        </div>
    </header>


Add multiple Tasks to a User Story in DevOps using Client libraries

Posted: 09 Aug 2021 08:04 AM PDT

I'm trying to add multiple tasks to a new user story in Azure DevOps and am struggling to link the newly created tasks to the user story.

I can create the user story with no problems and thought I could use the same method to create the tasks and then link via the Parent field but this doesn't seem to work.

This is the code I use to create the tasks:

var useCase = _traceContext.UseCases.Include(u => u.DeconstructionStage).FirstOrDefault(u => u.ObjID == useCaseObjID);
List<UseCaseADO> tasks;
if (useCase.DeconstructionStage.StageOrder < 4)
{
    tasks = new List<UseCaseADO>
    {
        { new UseCaseADO { StackRank = "1", WorkItemType = "Task", Title1="Create Test Script", Description="See Create Test Script in the Working Practices document", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Discover(Test)", Priority = "1" } },
        { new UseCaseADO { StackRank = "2", WorkItemType = "Task", Title1="Create Discovery Trace", Description="See Create Discovery Trace in the Working Practices document", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Discover(Test)", Priority = "1" } },
        { new UseCaseADO { StackRank = "3", WorkItemType = "Task", Title1="Review Use Case (Dev)", Description="See Review Use Case (Dev) in the Working Practices document", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Discover(Dev)", Priority = "1" } },
        { new UseCaseADO { StackRank = "4", WorkItemType = "Task", Title1="Review Use Case (Test)", Description="See Review Use Case (Test) in the Working Practices document", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Discover(Test)", Priority = "1" } },
        { new UseCaseADO { StackRank = "5", WorkItemType = "Task", Title1="Create and Annotate Define Traces", Description="See Create and Annotate Define Traces in the Working Practices document", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Define(Dev)", Priority = "1" } }
    };
}
else
{
    tasks = new List<UseCaseADO>
    {
        { new UseCaseADO { ID = "", StackRank = "6", WorkItemType = "Task", Title1="Refactor Code", AssignedTo="", Description="", AcceptanceCriteria="", State="New", Tags="Deconstruct(Dev)", Priority = "1" } },
        { new UseCaseADO { ID = "", StackRank = "7", WorkItemType = "Task", Title1="Deconstruction Test", Description="", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Deconstruct(Test)", Priority = "1" } },
        { new UseCaseADO { ID = "", StackRank = "8", WorkItemType = "Task", Title1="Decommision Code", Description="", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Decommision(Dev)", Priority = "1" } },
        { new UseCaseADO { ID = "", StackRank = "9", WorkItemType = "Task", Title1="Decommision Test", Description="", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Decommision(Test)", Priority = "1" } },
        { new UseCaseADO { ID = "", StackRank = "10", WorkItemType = "Task", Title1="Decommision Data", Description="", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Decommision(Dev)", Priority = "1" } },
        { new UseCaseADO { ID = "", StackRank = "11", WorkItemType = "Task", Title1="Decommision Test", Description="", AcceptanceCriteria="", AssignedTo="", State="New", Tags="Decommision(Test)", Priority = "1" } }
    };
}

VssBasicCredential credentials = new VssBasicCredential("", _personalAccessToken);
VssConnection connection = new VssConnection(new Uri(_uri), credentials);
WorkItemTrackingHttpClient workItemTrackingHttpClient = connection.GetClient<WorkItemTrackingHttpClient>();

try
{
    foreach (var task in tasks)
    {
        JsonPatchDocument patchDocument = GetTaskPatchDoc(userStoryID, task);
        WorkItem result = workItemTrackingHttpClient.CreateWorkItemAsync(patchDocument, _project, "Task").Result;
    }
}
catch (AggregateException ex)
{
    _logger.Error("Error creating user story: {0}", ex.InnerException.Message);
    return false;
}

GetTaskPatchDoc looks like this:

private JsonPatchDocument GetTaskPatchDoc(int userStoryID, UseCaseADO task)
{
    JsonPatchDocument patchDocument = new JsonPatchDocument();

    Dictionary<string, string> values = new Dictionary<string, string>()
    {
        { "/fields/System.Title", task.Title1 },
        { "/fields/System.Description", task.Description },
        { "/fields/System.State", task.State },
        { "/fields/System.Tags", task.Tags },
        { "/fields/Microsoft.VSTS.Common.StackRank", task.StackRank },
        { "/fields/Microsoft.VSTS.Common.Priority", task.Priority },
        { "/fields/System.Parent", userStoryID.ToString() }
    };

    foreach (KeyValuePair<string, string> value in values)
    {
        patchDocument.Add(
            new JsonPatchOperation()
            {
                Operation = Operation.Add,
                Path = value.Key,
                Value = value.Value
            }
        );
    }

    return patchDocument;
}

The tasks are created correctly but I'm missing the link between the user story and the task.

Does anyone know how to do this via the client libraries? I don't want to switch to the REST APIs, as the rest of the code is working just fine.

Thanks in advance.

Using a lambda function and continuing despite a KeyError

Posted: 09 Aug 2021 08:04 AM PDT

I am looking to continue using a for loop or lambda function despite it throwing a KeyError.

df_dates = ['7/1/2021', '8/1/2021', '8/2/2021']

df_values = (DataFrame shown as an image in the original post)

df_test = df_dates.apply(lambda x: df_values.loc[x])  

Essentially, the 7/1/2021 date in the list of dates is throwing a KeyError in the .loc lookup on df_values.
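Since df_values is only shown as an image, the frame below is invented for illustration. Two hedged ways to avoid the KeyError are to reindex (missing keys become NaN instead of raising) or to filter the dates down to those actually present in the index:

```python
import pandas as pd

# Invented stand-in for the df_values shown as an image in the post.
df_values = pd.DataFrame({"value": [1.0, 2.0]},
                         index=["8/1/2021", "8/2/2021"])
df_dates = ["7/1/2021", "8/1/2021", "8/2/2021"]

# Option 1: reindex -- missing dates become NaN rows instead of raising.
result = df_values.reindex(df_dates)

# Option 2: keep only the dates that exist in the index.
present = [d for d in df_dates if d in df_values.index]
subset = df_values.loc[present]
```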

Value cannot be Null. Parameter name: input error during streamreader

Posted: 09 Aug 2021 08:04 AM PDT

If I am just reading in a file, why wouldn't it know to just end if there is nothing left or if it is null? I believe the error is thrown when it's trying to do the match for 'line' and there aren't any more lines. Below is the code that reads in the file:

Dim Directory1 As String = "C:\Perl_Scripts\"
Dim Ext2Find As String = "850_*.txt"

If System.IO.Directory.GetFiles(Directory1, Ext2Find, IO.SearchOption.AllDirectories).Length > 0 Then
    Dim Return_Val() As String = System.IO.Directory.GetFiles(Directory1, Ext2Find, IO.SearchOption.AllDirectories)
    Dim i As Integer
    For i = 0 To System.IO.Directory.GetFiles(Directory1, Ext2Find, IO.SearchOption.AllDirectories).Length - 1
        'MsgBox(Ext2Find & " was found! " & Return_Val(i))

        Using sr As StreamReader = New StreamReader("C:\\Perl_scripts\\850_7703844115CH_54550937_20190218_062619.txt")
            Dim line As String
            ' Read and display lines from the file until the end of
            ' the file is reached.
            line = sr.ReadLine()
            While (line <> Nothing)
                line = sr.ReadLine()
                If Regex.IsMatch(line, "\b40\S*") Then  'Looks for Menard files
                    MsgBox(line)
                Else
                    MsgBox("THERES NO MATCH")
                End If
            End While
        End Using

    Next i
Else
    MsgBox(Ext2Find & " was not found!")
End If

Let me know if you need to see the whole sub. Thanks in advance!

Padding on span element pushes it above container

Posted: 09 Aug 2021 08:05 AM PDT

I have two questions:

  1. Why does padding on inline elements work? I have read that padding on an inline element should not work.
  2. Why does the padding push the span above the div container?

http://jsfiddle.net/L76e208g/1/

body > div {
    width: 150px;
    height: 150px;
    background: black;
    margin-top: 40px;
    display: inline-block;
    vertical-align: bottom;
}

div > div,
div > span {
    border-radius: 5px;
    padding: 10%;
    background: red;
}

<div>
  <div>
  </div>
</div>
<div>
  <span>
  </span>
</div>

Angular PrimeNG CSS p-table pSortableColumn, enable word-wrap break-word is not working

Posted: 09 Aug 2021 08:04 AM PDT

I am using a PrimeNG table with column headers that need to break words onto two lines. So far, I have tried CSS like the following:

.ui-dataTable thead h {
  word-wrap: break-word !important;
}

.break-word {
  word-wrap: break-word !important;
}

HTML code

<ng-template pTemplate="header">
    <tr>
        <th class="break-word" *ngFor="let column of columns" pSortableColumn={{column.field}}>{{column.name}} <p-sortIcon field={{column.field}}></p-sortIcon></th>
    </tr>
</ng-template>

I even went to forum.primefaces.org, but the CSS suggestion there did not work.

Thanks for any help.

How can I use Axios interceptors to add some headers to responses?

Posted: 09 Aug 2021 08:04 AM PDT

In my React.js app, I want to add an interceptor which can append some headers to some backend responses.

So, I tried this:

export default function App() {
  axios.interceptors.response.use(
    (config) => {
      config.headers['myheader'] = 'myvalue'; // <-- THIS IS MY CUSTOM HEADER
      return config;
    },
    (error) => {
      // ON ERROR
    }
  );
  ......

I supposed that my header would then be appended to every backend response, but that doesn't seem to work.

Any suggestions?

Print a different element from a list but with the same text and an increasing number?

Posted: 09 Aug 2021 08:04 AM PDT

I want to print it starting from "Francisco" and ending with "Antonio", keeping the text with the numbers increasing. I really don't know where to go with this; I'm very new. Thank you for helping!

list = ['Francisco', 'Marcelle', 'Olivia', 'José', 'Guilherme', 'Maria', 'Arthur', 'Lucas', 'Lurdes', 'Antonio']

for x in range(1,11) :
    print(list[x] + ", your number on the list is " + str(x) + ".")

In the last line I found how to keep the text and add an increasing number, but I don't know how to print a different element with range and for; I started programming not long ago.

Example: Francisco, your number is 1; Marcelle, your number is 2; ... Antonio, your number is 10.
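A hedged sketch of one way to get that output: list indices start at 0, so range(1, 11) skips the first element and overruns the last; enumerate with start=1 pairs each name with a 1-based position instead.

```python
names = ['Francisco', 'Marcelle', 'Olivia', 'José', 'Guilherme',
         'Maria', 'Arthur', 'Lucas', 'Lurdes', 'Antonio']

# enumerate(..., start=1) yields (1, 'Francisco'), (2, 'Marcelle'), ...
for number, name in enumerate(names, start=1):
    print(name + ", your number on the list is " + str(number) + ".")
```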

Register form similar to Facebook's in CSS

Posted: 09 Aug 2021 08:05 AM PDT

I have a register component which contains the red wrapper and the blue form. It opens when you press the button in the navigation.

The problem is that I want scrolling to happen only over the red component when the register form is opened, not over the whole site. So when the screen is big enough you'll see the whole register form, but when the screen is too small you'll be able to scroll it. It should look like the register form on Facebook when you click 'create new account'.

The red Wrapper component covers only part of the page, so as to fit the register form inside. The black border in the image represents the screen.

(illustration in the original post)

My register component code looks like this, but I don't know how to make scrolling happen only over the "red" component. The "blue" register form should stay in place while scrolling. How can I do this correctly?

const Wrapper = styled.div`
  min-height: 100vh;
  width: 100%;
  background-color: rgba(0, 0, 0, 0.7);
  position: absolute;
  z-index: 999999;
  display: flex;
  justify-content: center;
  padding: 30px 0;
`;

const Form = styled.form`
  background-color: white;
  padding: 50px 80px;
  position: relative;
  box-shadow: rgba(99, 99, 99, 0.2) 0px 2px 8px 0px;
  border-radius: 10px;
`;

const StyledInput = styled.input`
  padding: 10px 12px;
  display: block;
  border: 1px solid ${({ error }) => (error ? "red" : "#d6d6d6")};
  outline: none;
  width: 250px;

  :focus {
    border: 1px solid #00b4ff;
    color: #00b4ff;
  }
`;

const LoginButton = styled.button`
  background-color: ${({ facebook }) => (facebook ? "#4c69ba" : "#191919")};
  color: white;
  width: 100%;
  padding: 10px 0;
  text-align: center;
  cursor: pointer;
  border-radius: 10px;
  border: 0;
  display: block;

  :hover {
    opacity: 0.9;
  }

  :last-child {
    margin-top: 10px;
  }
`;

const Logo = styled.img`
  width: 150px;
  height: 22px;
`;

const LogoHolder = styled.div`
  width: 100%;
  display: flex;
  justify-content: center;
`;

const FieldType = styled.div`
  font-size: 13px;
  padding-bottom: 2px;
  margin-top: 15px;
`;

const CLose = styled(IoMdClose)`
  right: 10px;
  top: 10px;
  cursor: pointer;
  position: absolute;
  color: #191919;

  :hover {
    color: #00b4ff;
  }
`;

const StyledCheckbox = styled(Checkbox)`
  margin: 15px 0;
`;

const Error = styled.div`
  font-size: 11px;
  color: red;
  margin-top: ${({ statute }) => (statute ? "-10px" : "3px")};
  margin-bottom: ${({ statute }) => (statute ? "15px" : "0")};
  width: 250px;
`;

const Register = ({ setRegisterView }) => {
  return (
    <Wrapper>
      <Form>
        <CLose onClick={() => setRegisterView(false)} />
        <LogoHolder>
          <Logo src={logo} />
        </LogoHolder>
        <FieldType>First Name</FieldType>
        <StyledInput name="first_name" type="text" />
        <FieldType>Last Name</FieldType>
        <StyledInput name="last_name" type="text" />
        <FieldType>City</FieldType>
        <StyledInput name="city" type="text" />
        <FieldType>Email</FieldType>
        <StyledInput name="email" type="email" />
        <FieldType>Password</FieldType>
        <StyledInput name="password" type="password" />
        <FieldType>Repeat password</FieldType>
        <StyledInput name="repeated_password" type="password" />
        <StyledCheckbox text="Accept statute." />
        <LoginButton type="submit">SIGN UP</LoginButton>
        <LoginButton facebook>LOGIN WITH FACEBOOK</LoginButton>
      </Form>
    </Wrapper>
  );
};

export default Register;

How to use Model-Based Projection of Z3 from C/C++?

Posted: 09 Aug 2021 08:04 AM PDT

I want to use the Model-Based Projection that comes with Z3/Spacer from C/C++. I noticed Z3_qe_model_project in "z3_spacer.h"; however, after multiple experiments I couldn't figure out how to use it, since there is no documentation for it.

The question is: How to use Model-Based Projection of Z3 from C/C++?

Ansible search and query

Posted: 09 Aug 2021 08:05 AM PDT

Updated with suggestions from larsks. With the following structure

"intf_output_ios": {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python3"
    },
    "changed": false,
    "failed": false,
    "gathered": [
        { "name": "GigabitEthernet0/0" },
        {
            "mode": "trunk",
            "name": "GigabitEthernet0/1",
            "trunk": {
                "allowed_vlans": ["10", "20", "30", "99", "100"],
                "encapsulation": "dot1q"
            }
        },
        {
            "mode": "trunk",
            "name": "GigabitEthernet0/2",
            "trunk": {
                "allowed_vlans": ["10", "20", "30", "99", "100"],
                "encapsulation": "dot1q"
            }
        },
        {
            "access": { "vlan": 30 },
            "mode": "access",
            "name": "GigabitEthernet0/3"
        },
        { "name": "GigabitEthernet1/0" },
        { "name": "GigabitEthernet1/1" },
        { "name": "GigabitEthernet1/2" },
        { "name": "GigabitEthernet1/3" },
        { "name": "GigabitEthernet2/0" },
        { "name": "GigabitEthernet2/1" },
        { "name": "GigabitEthernet2/2" },
        { "name": "GigabitEthernet2/3" },
        { "name": "GigabitEthernet3/0" },
        { "name": "GigabitEthernet3/1" },
        { "name": "GigabitEthernet3/2" },
        {
            "access": { "vlan": 99 },
            "mode": "access",
            "name": "GigabitEthernet3/3"
        }
    ]
}

To print only the ports in VLAN 30, use the following:

- name: "P901T6: Set fact to include only access ports - IOS"
  set_fact:
    access_ports_ios_2: "{{ intf_output_ios | json_query(query) }}"
  vars:
    query: >-
      gathered[?access.vlan==`30`]

- name: "P901T7: Dump list of access ports - IOS"
  debug:
    var: access_ports_ios_2

NOTE: It is important to use 30 (with backticks) and not '30'

I have gone through https://docs.ansible.com/ansible/latest/user_guide/playbooks_filters.html#managing-list-variables without really understanding how to fix this. If someone has a good link, that would be very useful.

With a structure like

ok: [access01] => {
    "access_ports_ios": [
        {
            "access": { "vlan": 30 },
            "mode": "access",
            "name": "GigabitEthernet0/3"
        },
        {
            "access": { "vlan": 99 },
            "mode": "access",
            "name": "GigabitEthernet3/3"
        }
    ]
}

To get the ports in VLAN 30, use:

- debug:
    var: access_ports_ios|json_query(query)
  vars:
    query: >-
      [?access.vlan==`30`]
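For reference, the same selection expressed in plain Python (a sketch, independent of Ansible and JMESPath, over a minimal stand-in for the gathered list) is just a filter:

```python
# Minimal stand-in for the "gathered" list from the playbook output.
gathered = [
    {"name": "GigabitEthernet0/0"},
    {"access": {"vlan": 30}, "mode": "access", "name": "GigabitEthernet0/3"},
    {"access": {"vlan": 99}, "mode": "access", "name": "GigabitEthernet3/3"},
]

# Equivalent of the JMESPath query gathered[?access.vlan==`30`]:
# keep only entries whose access.vlan is the integer 30.
vlan30_ports = [p for p in gathered if p.get("access", {}).get("vlan") == 30]
print(vlan30_ports)  # [{'access': {'vlan': 30}, 'mode': 'access', 'name': 'GigabitEthernet0/3'}]
```

This also shows why the backticks matter in the JMESPath query: `30` compares against the integer 30, whereas '30' would compare against a string and match nothing.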

Asp.net core Identity Password reset "invalid token" even though tokens generated are identical

Posted: 09 Aug 2021 08:04 AM PDT

I am on ASP.NET Core with .NET 5.0. In Startup I have added:

services.AddIdentity<ApplicationUser, ApplicationRole>(SetupIdentityOptions)
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddDefaultTokenProviders();

services.Configure<DataProtectionTokenProviderOptions>(opt =>
    {
        opt.TokenLifespan = TimeSpan.FromDays(1);
    }
);

I generate the code as below (encoding done as per the MS docs):

var code = await CommonServices.UserManager.GeneratePasswordResetTokenAsync(user);
code = WebEncoders.Base64UrlEncode(Encoding.UTF8.GetBytes(code));

var callbackUrl = $"{request.RequestData.ReturnUrl}?code={code}";

Now if I save the code generated at this point and compare it with the code received by the controller after the user clicks the link, they are identical. However, when I do:

var result = await CS.UserManager.ResetPasswordAsync(user, model.Code, model.Password);  

it gives me "Invalid Token", even though the tokens are identical.

Then I tried this, and it still says invalid token:

var isValidToken = await _userManager.VerifyUserTokenAsync(
    user,
    _userManager.Options.Tokens.PasswordResetTokenProvider,
    UserManager<TUser>.ResetPasswordTokenPurpose,
    code);

I am not sure whether the Identity framework saves the generated tokens in the [AspNetUserTokens] table. I can't find any saved tokens after generating the token.

What may be the problem?

efficient way to join 65,000 .csv files

Posted: 09 Aug 2021 08:04 AM PDT

I have, say, 65,000 .csv files that I need to work with in the Julia language.

The goal is to perform basic statistics on the data set.

I had some ways of joining all the data sets:

#1 - set a common index and leftjoin() - perform statistics row-wise
#2 - vcat() the dataframes on top of each other - vertically stacked, use groupby

Either way, the final data frames are very large and become slow to process.

Is there an efficient way of doing this?

I thought of performing either #1 or #2 and splitting the joining operations into thirds: say, after 20,000 joins, save to .csv and operate in chunks, then at the end join all 3 in one last operation.

I'm not sure how to replicate making 65k .csv files, but basically below I loop through the files in the directory, load each csv, and vcat() it onto one df. The question is more about whether there is a better way to manage the size of the operation.

# read all files in directory
csv_dir_tmax = cd(readdir, "C:/Users/andrew.bannerman/Desktop/Julia/scripts/GHCN data/ghcnd_all_csv/tmax")

# initialize outputs
tmax_all = DataFrame(Date = [], TMAX = [])
c = 1
for c = 1:length(csv_dir_tmax)
    print("Starting csv file ", csv_dir_tmax[c], " - Iteration ", c, "\n")
    if c <= length(csv_dir_tmax)
        csv_tmax = CSV.read(join(["C:/Users/andrew.bannerman/Desktop/Julia/scripts/GHCN data/ghcnd_all_csv/tmax/", csv_dir_tmax[c]]), DataFrame, header=true)
        tmax_all = vcat(tmax_all, csv_tmax)
    end
end
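The chunking idea above can be illustrated in Python/pandas (a sketch with invented toy frames; the Julia equivalent would collect DataFrames in a vector and call vcat once per chunk). The key point is that growing one frame by repeated concatenation is quadratic in total work, whereas collecting frames and concatenating once per chunk is not:

```python
import pandas as pd

# Invented sample frames standing in for the 65,000 CSVs.
frames = [pd.DataFrame({"Date": [i], "TMAX": [20 + i]}) for i in range(100)]

# Instead of growing one frame with repeated concat, collect frames
# and concatenate once per chunk, then once at the end.
chunk_size = 25
chunks = [pd.concat(frames[i:i + chunk_size], ignore_index=True)
          for i in range(0, len(frames), chunk_size)]
tmax_all = pd.concat(chunks, ignore_index=True)
```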

How do I save the data that has been randomly undersampled?

Posted: 09 Aug 2021 08:05 AM PDT

I am trying to balance a data frame by random undersampling of the majority class. It has been successful; however, I also want to save the data that was removed from the data frame (undersampled) to a new data frame. How do I accomplish this?

This is the code that I am using to undersample the data frame

from imblearn.under_sampling import RandomUnderSampler

rus = RandomUnderSampler(sampling_strategy=1)
X_res, y_res = rus.fit_resample(X, y)

df1 = pd.concat([X_res, y_res], axis=1)
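One way to get at the dropped rows, sketched here with plain pandas on a made-up toy frame (the column names and class sizes are illustrative, not from the question): undersample the majority class by sampling it yourself, then take the index difference against the original frame.

```python
import pandas as pd

# Toy imbalanced frame (names and values are illustrative only):
# six majority-class rows (label 0) and two minority-class rows (label 1).
df = pd.DataFrame({
    "feature": range(8),
    "label": [0, 0, 0, 0, 0, 0, 1, 1],
})

n_minority = (df["label"] == 1).sum()

# Randomly undersample the majority class down to the minority count
# (this mirrors sampling_strategy=1, i.e. a 1:1 class ratio).
majority_kept = df[df["label"] == 0].sample(n=n_minority, random_state=0)
balanced = pd.concat([majority_kept, df[df["label"] == 1]])

# Everything whose index survived was kept; the rest was undersampled away.
removed = df.loc[df.index.difference(balanced.index)]

print(len(balanced), len(removed))  # 4 4
```

If you want to stay with imbalanced-learn, recent versions expose the kept positional indices as `rus.sample_indices_` after `fit_resample`, so the removed rows should be recoverable with something like `df.drop(df.index[rus.sample_indices_])` — worth double-checking against the version you have installed.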

bootstrap collapse does not work correctly, closes immediately after opening

Posted: 09 Aug 2021 08:04 AM PDT

I have a table tag which I want to open when the button is clicked, but it opens and then closes immediately.

        <button class="btn btn-primary"
                data-toggle="collapse"
                href="#collapseExample"
                aria-expanded="false">Get rate</button>

here is my table :

            <table class="table table-hover table-condensed collapse" id="collapseExample">
              <thead>
                <tr>
                  <th style="width:50%"></th>
                </tr>
              </thead>
              <tbody>
                <tr>
                  <th scope="row">You Pay</th>
                  <td>{{total}} SEK</td>
                </tr>
                <tr>
                  <th scope="row">Price</th>
                  <td>{{price}} {{currency}}</td>
                </tr>
                <tr>
                  <th scope="row">Rate</th>
                  <td colspan="2">{{rate}}</td>
                </tr>
              </tbody>
            </table>

I need help downgrading MariaDB from 10.2.40 to 5.5.68

Posted: 09 Aug 2021 08:04 AM PDT

Due to a backend application I need MariaDB 5.5.68. I have installed CentOS Web Panel, and after running yum remove mariadb, yum remove MariaDB-common, rm -rf /var/lib/mysql and rm /etc/my.cnf, I installed MariaDB 5.5.68.

The problem now is that when I write mysql I get this error:

"ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)".      [root@server720546 ~]# systemctl start mariadb      Failed to start mariadb.service: Unit not found.  

UPDATE

[root@server720546 mariadb]# systemctl status mariadb
● mariadb.service - MariaDB database server
   Loaded: loaded (/usr/lib/systemd/system/mariadb.service; disabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/mariadb.service.d
           └─migrated-from-my.cnf-settings.conf
   Active: failed (Result: exit-code) since Sat 2021-08-07 14:22:22 UTC; 5min ago
  Process: 1672 ExecStartPost=/usr/libexec/mariadb-wait-ready $MAINPID (code=exited, status=1/FAILURE)
  Process: 1671 ExecStart=/usr/bin/mysqld_safe --basedir=/usr (code=exited, status=0/SUCCESS)
  Process: 1637 ExecStartPre=/usr/libexec/mariadb-prepare-db-dir %n (code=exited, status=0/SUCCESS)
 Main PID: 1671 (code=exited, status=0/SUCCESS)

Aug 07 14:22:21 server720546.cloudsons.net systemd[1]: Starting MariaDB database server...
Aug 07 14:22:21 server720546.cloudsons.net mariadb-prepare-db-dir[1637]: Database MariaDB is probably initialized in /var/lib/mysql already, nothing is done.
Aug 07 14:22:21 server720546.cloudsons.net mysqld_safe[1671]: 210807 14:22:21 mysqld_safe Logging to '/var/log/mariadb/mariadb.log'.
Aug 07 14:22:21 server720546.cloudsons.net mysqld_safe[1671]: 210807 14:22:21 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql
Aug 07 14:22:22 server720546.cloudsons.net systemd[1]: mariadb.service: control process exited, code=exited status=1
Aug 07 14:22:22 server720546.cloudsons.net systemd[1]: Failed to start MariaDB database server.
Aug 07 14:22:22 server720546.cloudsons.net systemd[1]: Unit mariadb.service entered failed state.
Aug 07 14:22:22 server720546.cloudsons.net systemd[1]: mariadb.service failed.

From mariadb.log:

210807 14:22:21 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql
210807 14:22:21 [Note] /usr/libexec/mysqld (mysqld 5.5.68-MariaDB) starting as process 1836 ...
210807 14:22:21 InnoDB: The InnoDB memory heap is disabled
210807 14:22:21 InnoDB: Mutexes and rw_locks use GCC atomic builtins
210807 14:22:21 InnoDB: Compressed tables use zlib 1.2.7
210807 14:22:21 InnoDB: Using Linux native AIO
210807 14:22:21 InnoDB: Initializing buffer pool, size = 128.0M
210807 14:22:21 InnoDB: Completed initialization of buffer pool
InnoDB: Error: checksum mismatch in data file ./ibdata1
210807 14:22:21 InnoDB: Could not open or create data files.
210807 14:22:21 InnoDB: If you tried to add new data files, and it failed here,
210807 14:22:21 InnoDB: you should now edit innodb_data_file_path in my.cnf back
210807 14:22:21 InnoDB: to what it was, and remove the new ibdata files InnoDB created
210807 14:22:21 InnoDB: in this failed attempt. InnoDB only wrote those files full of
210807 14:22:21 InnoDB: zeros, but did not yet use them in any way. But be careful: do not
210807 14:22:21 InnoDB: remove old data files which contain your precious data!
210807 14:22:21 [ERROR] Plugin 'InnoDB' init function returned error.
210807 14:22:21 [ERROR] Plugin 'InnoDB' registration as a STORAGE ENGINE failed.
210807 14:22:21 [Note] Plugin 'FEEDBACK' is disabled.
210807 14:22:21 [ERROR] Unknown/unsupported storage engine: InnoDB
210807 14:22:21 [ERROR] Aborting

210807 14:22:21 [Note] /usr/libexec/mysqld: Shutdown complete

210807 14:22:21 mysqld_safe mysqld from pid file /var/run/mariadb/mariadb.pid ended

UPDATE 2

With journalctl -xe I have this:

[root@server720546 ~]# journalctl -xe
--
-- The start-up result is done.
Aug 07 14:52:05 server720546.cloudsons.net systemd[1]: Started Session 6 of user root.
-- Subject: Unit session-6.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-6.scope has finished starting up.
--
-- The start-up result is done.
Aug 07 14:52:05 server720546.cloudsons.net systemd-logind[482]: New session 6 of user root.
-- Subject: A new session 6 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 6 has been created for the user root.
--
-- The leading process of the session is 2394.
Aug 07 14:52:05 server720546.cloudsons.net sshd[2394]: pam_unix(sshd:session): session opened for user root by (uid=0)
Aug 07 14:52:06 server720546.cloudsons.net sshd[2396]: Failed password for root from 198.23.233.28 port 57126 ssh2
Aug 07 14:52:06 server720546.cloudsons.net sshd[2396]: Received disconnect from 198.23.233.28 port 57126:11: Bye Bye [preauth]
Aug 07 14:52:06 server720546.cloudsons.net sshd[2396]: Disconnected from 198.23.233.28 port 57126 [preauth]

'FailedExecution' error when executing workitem

Posted: 09 Aug 2021 08:05 AM PDT

Prior to changing the engine version, my Design Automation for Inventor app was working fine. After changing to a newer engine version, I started getting errors when executing the workitems. The reason I changed the engine was that my workitems for newer Inventor files (> 2018) were not processed, while those from 2018 and older worked fine.

So, I changed 'Autodesk.Inventor+23' to 'Autodesk.Inventor+24' and even to 'Autodesk.Inventor+2021' (current). After that, all my workitems run into a "FailedExecution" error.

Here are the WorkItem details:

  • workitem Id : f4ff1446ac994007a8cfe3ba565caabf

My guess is that my add-in is not loading correctly, but I am not sure how to verify that. Any help in finding the cause of the issue is appreciated.

Bot works locally but unresponsive when deployed to Azure Function

Posted: 09 Aug 2021 08:04 AM PDT

I've created a bot using Bot Framework and deployed it to an Azure Function. When I run the bot locally, it responds to every message correctly. But after I deployed it to the Azure Function and changed the messaging endpoint using the URL that I got from Azure Functions, the bot doesn't respond at all.

  1. Created a bot using Node.js. Here's the Azure Function code:

    module.exports = async function (context, req) {
      context.log('JavaScript HTTP trigger function processed a request.');
      context.log('context: ', context.bindings.req.body);
      adapter.processActivity(req, context.res, async (ctx) => {
        // Route to main dialog.
        await bot.run(ctx);
      });
    }
  2. It works fine locally

  3. Deployed it to Azure Functions and activated streaming logs in Visual Studio Code

  4. Generated the Azure Functions URL for the bot messaging endpoint

  5. Changed the bot messaging endpoint in the bot's configuration

  6. Sent a message to the bot via the Microsoft Teams channel. It looks like the function runs, because the logs from context.log are shown in the streaming logs, but I do not receive any messages back

What could be wrong?

best way to iterate through elements of pandas Series

Posted: 09 Aug 2021 08:04 AM PDT

All of the following seem to work for iterating through the elements of a pandas Series. I'm sure there are more ways of doing it. What are the differences, and which is the best way?

import pandas

arr = pandas.Series([1, 1, 1, 2, 2, 2, 3, 3])

# 1
for el in arr:
    print(el)

# 2
for _, el in arr.iteritems():
    print(el)

# 3
for el in arr.array:
    print(el)

# 4
for el in arr.values:
    print(el)

# 5
for i in range(len(arr)):
    print(arr.iloc[i])
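As a quick sanity check (not a benchmark), the five variants can be compared on a small Series — they all visit the same elements in the same order and differ mainly in what container is materialized first. Note that iteritems() is the older spelling of items(), which newer pandas versions prefer:

```python
import pandas as pd

arr = pd.Series([1, 1, 2, 3])

direct = [el for el in arr]                        # 1: iterate the Series itself
via_items = [el for _, el in arr.items()]          # 2: (index, value) pairs
via_array = [el for el in arr.array]               # 3: the extension array behind the Series
via_values = [el for el in arr.values]             # 4: a NumPy representation of the data
via_iloc = [arr.iloc[i] for i in range(len(arr))]  # 5: one positional lookup per element

# All five produce the same elements in the same order.
assert direct == via_items == list(via_array) == list(via_values) == via_iloc
assert [int(x) for x in direct] == [1, 1, 2, 3]
```

In practice the per-element .iloc loop tends to be the slowest of the five, since every iteration pays the cost of a full indexing call.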

Writing a pydantic object into a sqlalchemy json column

Posted: 09 Aug 2021 08:04 AM PDT

I'm looking for a way to have a pydantic object stored in a sqlalchemy json column. My attempts so far are being tripped up by a datetime field in the pydantic object. I feel like I'm missing something obvious.

My first attempt was to simply serialise the result of .dict(). But this doesn't convert datetime objects to strings, so the serialiser falls over. If I convert with .json() then the result is a string, and what's stored in the database is the JSON of a string, not a dict.

import sqlalchemy.orm
from pydantic import BaseModel
from datetime import datetime

mapper_registry = sqlalchemy.orm.registry()
Base = mapper_registry.generate_base()


class _PydanticType(sqlalchemy.types.TypeDecorator):
    impl = sqlalchemy.types.JSON

    def __init__(self, pydantic_type):
        super().__init__()
        self._pydantic_type = pydantic_type

    def process_bind_param(self, value, dialect):
        return value.dict()

    def process_result_value(self, value, dialect):
        return self._pydantic_type.parse_obj(value)


class Test(BaseModel):
    timestamp: datetime


class Foo(Base):
    __tablename__ = 'foo'
    x = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
    y = sqlalchemy.Column(_PydanticType(Test))


engine = sqlalchemy.create_engine('sqlite:///x.db', echo=True)
mapper_registry.metadata.create_all(bind=engine)
session = sqlalchemy.orm.sessionmaker(bind=engine)()
session.add(Foo(x=1, y=Test(timestamp=datetime.now())))
session.commit()
sqlalchemy.exc.StatementError: (builtins.TypeError) Object of type datetime is not JSON serializable  
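For what it's worth, the behaviour described above can be reproduced without SQLAlchemy at all; this sketch (pydantic v1-style API, as used in the question) shows why .dict() trips up json.dumps and how a .json() round-trip yields a serializable dict:

```python
import json
from datetime import datetime
from pydantic import BaseModel


class Test(BaseModel):
    timestamp: datetime


t = Test(timestamp=datetime(2021, 8, 9, 12, 0, 0))

# .dict() keeps the datetime object, so the stock JSON serializer rejects it.
try:
    json.dumps(t.dict())
except TypeError as exc:
    print(exc)  # Object of type datetime is not JSON serializable

# Round-tripping through .json() applies pydantic's own encoders first,
# so the datetime arrives as an ISO-8601 string inside a plain dict.
as_dict = json.loads(t.json())
print(as_dict["timestamp"])  # 2021-08-09T12:00:00
```

Applied to the question's _PydanticType, this suggests returning json.loads(value.json()) from process_bind_param instead of value.dict() — though that last step is an inference from the sketch, not something verified against SQLAlchemy here.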

How to setup a listener for Firebase connectivity in Flutter?

Posted: 09 Aug 2021 08:04 AM PDT

So I'm trying to catch a 'No Network' error.

I'm listening to a document in Firestore like this:

final docRef = FirebaseFirestore.instance.collection(kInfoCollection).doc(kInfoDocument);

docSubscription = docRef.snapshots().distinct().listen((_) {});

docSubscription.onData((snapshot) => _processInfo(snapshot));

docSubscription.onError((e, s) {
  final errorMessage = 'Message:\n$e \nThe Stack was:\n$s';
  print('Error $errorMessage');
  throw InfoException(errorMessage);
});

So as you can see, the listener is listening to a document and will throw an error if it catches any.

But these Firebase errors, as shown in the console, weren't caught:

Connection 1: encountered error(1:53)
6.26.0 - [Firebase/Firestore][I-FST000001] WatchStream (283fb4118) Stream error: 'Unavailable: Network connectivity changed'
6.26.0 - [Firebase/Firestore][I-FST000001] WatchStream (283fb4118) Stream error: 'Unavailable: DNS resolution failed'
6.26.0 - [Firebase/Firestore][I-FST000001] WatchStream (283fb4118) Stream error: 'Unavailable: DNS resolution failed'
6.26.0 - [Firebase/Firestore][I-FST000001] WatchStream (283fb4118) Stream error: 'Unavailable: failed to connect to all addresses'

Am I missing something?

Additional info: I simulate this by going to Airplane mode on my phone. Otherwise there is no error.

Override text rendering in Visual Studio?

Posted: 09 Aug 2021 08:04 AM PDT

I want to customize the way the text is rendered in Microsoft's Visual Studio text viewer. The goal is to implement my own complicated algorithm for whitespace handling.

Note: I've already written "add-ins" to automate some tasks (include guards, copyright insertion in the file's header etc...). However, as far as I understand, "add-ins" cannot be used to override the kind of functionality I need.

Is it possible to override the default text rendering? If yes, where do I start?

(Visual Studio 2005 is preferred.)
