Sunday, August 15, 2021

Recent Questions - Stack Overflow


How to make a function in R to recode a variable into new binary columns? (with ifelse statement)

Posted: 15 Aug 2021 08:06 AM PDT

I want to make a function to recode values within a variable to binary 0 and 1 using ifelse. Suppose I have this dataset:

df <- data.frame(
  id = 1:10,
  region = rep(c("Asia", "Africa", "Europe", "America"), length = 10)
)

This is the output that I want:

[Image: desired output — the original id and region columns plus one 0/1 indicator column per region (Asia, Africa, Europe, America).]

However, I want to create those columns by using a function, so I just have to pass the data and the variable to the function. This is as far as I get:

binary <- function(data2, var, value){
    for(i in 1:nrow(data2)){
        val <- ifelse(data2[data2[var] == value, 1, 0)
        data2 <- cbind(data2, val)
        seval <- (data2$val)*(data2$sens)
        spval <- (data2$val)*(data2$spec)
    }
}

Does anyone know how to use the ifelse function within the for loop and function in R? Any help is much appreciated. Thank you.
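
For reference, a minimal sketch of one way the function could look (the seval/spval bookkeeping is left out, since sens and spec are not part of the example data). ifelse() is vectorised, so no explicit row loop is needed, and one 0/1 column can be created per distinct value:

binary <- function(data, var) {
  # One indicator column per distinct value of `var`
  for (value in unique(data[[var]])) {
    data[[paste0(var, "_", value)]] <- ifelse(data[[var]] == value, 1, 0)
  }
  data
}

df <- data.frame(
  id = 1:10,
  region = rep(c("Asia", "Africa", "Europe", "America"), length.out = 10)
)
binary(df, "region")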

Add "submethods" to classes for better organization?

Posted: 15 Aug 2021 08:06 AM PDT

I just started to use TypeScript and classes, and I would like to know if it is possible to add "submethods" to an existing class. By that I mean: I have a main class called Mongo which currently has about 30 static methods, and I would like to split them into categories (and files) such as find, insert, delete, and update, and access them as Mongo.category.method.

Here is what one of my static methods looks like

class Mongo {
  static db: Db;

  static async connect() {
    const URL = `mongodb://${config.MONGODB_URI}:${config.MONGODB_PORT}`;
    const client = await MongoClient.connect(URL, MONGODB_OPTIONS);
    Mongo.db = client.db(config.MONGODB_DATABASE);
    log.info('Connected to MongoDB!');
  }

  static async findUserById(channelId: string, userId: string): Promise<IUserLog | undefined> {
    return await Mongo.db.collection(`logger_${channelId}`).findOne({ _id: userId });
  }
}

Following my example, this method would ideally be accessed as Mongo.find.userById.

But is this even possible?

Thanks a lot, and sorry in advance if the code isn't well written.
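
For what it's worth, a minimal single-file sketch of the pattern (in practice each category object would live in its own file and be imported; collection typings and connection details are simplified): group the functions per category in plain objects and expose them as static members, so calls read Mongo.find.userById(...).

import { Db, MongoClient } from 'mongodb';

// One plain object per category (ideally one per file).
const find = {
  async userById(channelId: string, userId: string) {
    return Mongo.db.collection(`logger_${channelId}`).findOne({ _id: userId });
  },
};

const update = {
  async userName(channelId: string, userId: string, name: string) {
    return Mongo.db.collection(`logger_${channelId}`).updateOne({ _id: userId }, { $set: { name } });
  },
};

class Mongo {
  static db: Db;

  static async connect(url: string, dbName: string) {
    const client = await MongoClient.connect(url);
    Mongo.db = client.db(dbName);
  }

  // Category objects attached as static members:
  static find = find;
  static update = update;
}

// Usage: await Mongo.find.userById(channelId, userId);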

Change AppBar title for each button clicked in Flutter

Posted: 15 Aug 2021 08:06 AM PDT

Is there any way I can change the AppBar's title: based on my BottomNavigationBar's button label? I am building an app where the navigation bar opens each class (page) on button click,

like this maybe?

appBar: AppBar(
  title: SelectedIndex(label/title),
),

here's the source code:

    import 'package:flutter/material.dart';    import 'BoosterCommunity_Page.dart';  import 'Diary_Page.dart';  import 'GradeTracker_Page.dart';  import 'CalendarView_Page.dart';  import 'QuotesPage.dart';  import 'ListView_Page.dart';    class HomePage extends StatefulWidget{    HomePage({Key? key}): super(key: key);      @override    _HomePageState createState() => _HomePageState();  }    class _HomePageState extends State<HomePage>{    PageController _pageController = PageController();    List<Widget> _screens = [      QuotesPage(), ListViewPage(), CalendarViewPage(), GradeTrackerPage(), DiaryPage(), BoosterCommunityPage(),    ];      void _onPageChanged(int index){}    void _onItemsTapped(int selectedIndex){      _pageController.jumpToPage(selectedIndex);    }      @override    Widget build(BuildContext context){      return Scaffold(        appBar: AppBar(          centerTitle: true,      //I want it to be implimented in this line            title: (BottomNavBar selected index title or label),        ),        body: PageView(          controller: _pageController,          children: _screens,          onPageChanged: _onPageChanged,          physics: NeverScrollableScrollPhysics(),        ),        bottomNavigationBar: BottomNavigationBar(          onTap: _onItemsTapped,          items: [            BottomNavigationBarItem(              icon: Icon(Icons.home, color: Colors.grey,),              label: 'Home',            ),            BottomNavigationBarItem(              icon: Icon(Icons.list, color: Colors.grey,),              label: 'Task List',            ),            BottomNavigationBarItem(              icon: Icon(Icons.calendar_view_month, color: Colors.grey,),              label: 'Calendar View',            ),            BottomNavigationBarItem(              icon: Icon(Icons.grade, color: Colors.grey,),              label: 'Grade Tracker',            ),            BottomNavigationBarItem(              icon: Icon(Icons.book, color: Colors.grey,),              label: 'Diary Page',            ),            BottomNavigationBarItem(              icon: Icon(Icons.business, color: Colors.grey,),              label: 'Booster Community',            ),                                                          ],        ),              drawer: Drawer(          child: ListView(            padding: EdgeInsets.zero,            children: [              Container(                height: 100.0,                child: const DrawerHeader(                decoration: BoxDecoration(                color: Colors.orange,                ),                child: Text('Sign in first'),              ),              ),              ListTile(                title: const Text('Account'),                onTap: () {                  Navigator.pop(context);                },              ),              ListTile(                title: const Text('Settings'),                onTap: () {                  Navigator.pop(context);                },              ),              ListTile(                title: const Text('Help and Support'),                onTap: (){                  Navigator.pop(context);                },              ),            ],          ),        ),        );    }  }  

Is it possible, or is there an easy way? Please let me know. Thank you in advance.
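
For reference, a minimal sketch of one way this could work (only the relevant parts are shown; the body, bottomNavigationBar, and drawer stay as in the code above): keep the selected index in state and reuse one list of labels for both the AppBar title and the nav items.

import 'package:flutter/material.dart';

class HomePage extends StatefulWidget {
  const HomePage({Key? key}) : super(key: key);

  @override
  _HomePageState createState() => _HomePageState();
}

class _HomePageState extends State<HomePage> {
  final PageController _pageController = PageController();
  int _selectedIndex = 0;

  // One list of labels, reused for the AppBar title and the nav items.
  static const List<String> _titles = [
    'Home', 'Task List', 'Calendar View', 'Grade Tracker', 'Diary Page', 'Booster Community',
  ];

  void _onItemsTapped(int selectedIndex) {
    setState(() => _selectedIndex = selectedIndex); // triggers the AppBar rebuild
    _pageController.jumpToPage(selectedIndex);
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        centerTitle: true,
        title: Text(_titles[_selectedIndex]), // follows the tapped item
      ),
      // body (PageView), bottomNavigationBar (with currentIndex: _selectedIndex)
      // and drawer stay the same as in the original code.
    );
  }
}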

Reactor Netty websocket channel closed prematurely

Posted: 15 Aug 2021 08:05 AM PDT

I have a long-running websocket client implemented in Java Spring Reactor with Netty, targeting the Binance WebSocket API. According to the specs, the websocket channel is kept open for 24 hours.

The websocket is unexpectedly and prematurely closed after around 3 minutes:

16:50:48.418 [main] DEBUG reactor.util.Loggers - Using Slf4j logging framework  16:50:48.434 [main] DEBUG io.netty.util.internal.logging.InternalLoggerFactory - Using SLF4J as the default logging framework  16:50:48.436 [main] DEBUG io.netty.util.internal.PlatformDependent0 - -Dio.netty.noUnsafe: false  16:50:48.437 [main] DEBUG io.netty.util.internal.PlatformDependent0 - Java version: 14  16:50:48.438 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available  16:50:48.438 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available  16:50:48.438 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Buffer.address: available  16:50:48.439 [main] DEBUG io.netty.util.internal.PlatformDependent0 - direct buffer constructor: unavailable: Reflective setAccessible(true) disabled  16:50:48.439 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Bits.unaligned: available, true  16:50:48.440 [main] DEBUG io.netty.util.internal.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable: class io.netty.util.internal.PlatformDependent0$6 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @1efbd816  16:50:48.440 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): unavailable  16:50:48.440 [main] DEBUG io.netty.util.internal.PlatformDependent - sun.misc.Unsafe: available  16:50:48.448 [main] DEBUG io.netty.util.internal.PlatformDependent - maxDirectMemory: 8388608000 bytes (maybe)  16:50:48.448 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.tmpdir: /tmp (java.io.tmpdir)  16:50:48.448 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)  16:50:48.449 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.maxDirectMemory: -1 bytes  16:50:48.450 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: -1  16:50:48.450 [main] DEBUG io.netty.util.internal.CleanerJava9 - java.nio.ByteBuffer.cleaner(): available  16:50:48.450 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.noPreferDirect: false  16:50:48.460 [main] DEBUG reactor.netty.tcp.TcpResources - [http] resources will use the default LoopResources: DefaultLoopResources {prefix=reactor-http, daemon=true, selectCount=8, workerCount=8}  16:50:48.460 [main] DEBUG reactor.netty.tcp.TcpResources - [http] resources will use the default ConnectionProvider: reactor.netty.resources.DefaultPooledConnectionProvider@192b07fd  16:50:48.485 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.level: simple  16:50:48.486 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.targetRecords: 4  16:50:48.581 [main] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false  16:50:48.581 [main] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false  16:50:48.582 [main] DEBUG io.netty.util.NetUtilInitializations - Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)  16:50:48.583 [main] DEBUG io.netty.util.NetUtil - /proc/sys/net/core/somaxconn: 128  16:50:48.590 [main] DEBUG org.springframework.web.reactive.socket.client.ReactorNettyWebSocketClient - Connecting to wss://stream.binance.com:9443/ws  16:50:48.601 [main] DEBUG io.netty.handler.ssl.OpenSsl - netty-tcnative not in the classpath; OpenSslEngine will 
be unavailable.  16:50:48.712 [main] DEBUG io.netty.handler.ssl.JdkSslContext - Default protocols (JDK): [TLSv1.3, TLSv1.2, TLSv1.1, TLSv1]   16:50:48.712 [main] DEBUG io.netty.handler.ssl.JdkSslContext - Default cipher suites (JDK): [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_AES_128_GCM_SHA256, TLS_AES_256_GCM_SHA384]  16:50:48.720 [main] DEBUG reactor.netty.resources.DefaultLoopIOUring - Default io_uring support : false  16:50:48.724 [main] DEBUG io.netty.util.internal.NativeLibraryLoader - -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)  16:50:48.725 [main] DEBUG io.netty.util.internal.NativeLibraryLoader - -Dio.netty.native.deleteLibAfterLoading: true  16:50:48.725 [main] DEBUG io.netty.util.internal.NativeLibraryLoader - -Dio.netty.native.tryPatchShadedId: true  16:50:48.730 [main] DEBUG io.netty.util.internal.NativeLibraryLoader - Successfully loaded the library /tmp/libnetty_transport_native_epoll_x86_6410359104745093945181.so  16:50:48.731 [main] DEBUG reactor.netty.resources.DefaultLoopEpoll - Default Epoll support : true  16:50:48.734 [main] DEBUG io.netty.channel.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 16  16:50:48.742 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024  16:50:48.743 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096  16:50:48.749 [main] DEBUG io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available  16:50:48.768 [main] DEBUG reactor.netty.resources.PooledConnectionProvider - Creating a new [http] client pool [PoolFactory{evictionInterval=PT0S, leasingStrategy=fifo, maxConnections=500, maxIdleTime=-1, maxLifeTime=-1, metricsEnabled=false, pendingAcquireMaxCount=1000, pendingAcquireTimeout=45000}] for [stream.binance.com/<unresolved>:9443]  16:50:48.798 [main] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.processId: 27223 (auto-detected)  16:50:48.799 [main] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.machineId: 28:16:ad:ff:fe:2b:7c:b7 (auto-detected)  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 16  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 16  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 11  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 16777216  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimIntervalMillis: 0  16:50:48.809 [main] DEBUG 
io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.useCacheForAllThreads: true  16:50:48.809 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023  16:50:48.813 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.allocator.type: pooled  16:50:48.813 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.threadLocalDirectBufferSize: 0  16:50:48.814 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.maxThreadLocalCharBufferSize: 16384  16:50:48.828 [reactor-http-epoll-2] DEBUG reactor.netty.resources.PooledConnectionProvider - [id:d962b126] Created a new pooled channel, now: 0 active connections, 0 inactive connections and 0 pending acquire requests.  16:50:48.845 [reactor-http-epoll-2] DEBUG reactor.netty.tcp.SslProvider - [id:d962b126] SSL enabled using engine sun.security.ssl.SSLEngineImpl@55608030 and SNI stream.binance.com/<unresolved>:9443  16:50:48.852 [reactor-http-epoll-2] DEBUG io.netty.buffer.AbstractByteBuf - -Dio.netty.buffer.checkAccessible: true  16:50:48.853 [reactor-http-epoll-2] DEBUG io.netty.buffer.AbstractByteBuf - -Dio.netty.buffer.checkBounds: true  16:50:48.853 [reactor-http-epoll-2] DEBUG io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@3ba51dc6  16:50:48.854 [reactor-http-epoll-2] DEBUG reactor.netty.transport.TransportConfig - [id:d962b126] Initialized pipeline DefaultChannelPipeline{(reactor.left.sslHandler = io.netty.handler.ssl.SslHandler), (reactor.left.sslReader = reactor.netty.tcp.SslProvider$SslReadHandler), (reactor.left.httpCodec = io.netty.handler.codec.http.HttpClientCodec), (reactor.right.reactiveBridge = reactor.netty.channel.ChannelOperationsHandler)}  16:50:48.866 [reactor-http-epoll-1] DEBUG io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@1fb356c5  16:50:48.867 [reactor-http-epoll-1] DEBUG io.netty.resolver.dns.DnsQueryContext - [id: 0xdd7103d7] WRITE: UDP, [11524: /127.0.0.53:53], DefaultDnsQuestion(stream.binance.com. IN A)  16:50:48.869 [reactor-http-epoll-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.maxCapacityPerThread: 4096  16:50:48.869 [reactor-http-epoll-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.maxSharedCapacityFactor: 2  16:50:48.869 [reactor-http-epoll-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.linkCapacity: 16  16:50:48.869 [reactor-http-epoll-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.ratio: 8  16:50:48.869 [reactor-http-epoll-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.delayedQueue.ratio: 8  16:50:48.878 [reactor-http-epoll-1] DEBUG io.netty.resolver.dns.DnsQueryContext - [id: 0xdd7103d7] WRITE: UDP, [33872: /127.0.0.53:53], DefaultDnsQuestion(stream.binance.com. IN AAAA)  16:50:48.904 [reactor-http-epoll-1] DEBUG io.netty.resolver.dns.DnsNameResolver - [id: 0xdd7103d7] RECEIVED: UDP [11524: /127.0.0.53:53], DatagramDnsResponse(from: /127.0.0.53:53, 11524, QUERY(0), NoError(0), RD RA)      DefaultDnsQuestion(stream.binance.com. IN A)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 12 IN A 4B)      DefaultDnsRawRecord(stream.binance.com. 
12 IN A 4B)      DefaultDnsRawRecord(OPT flags:0 udp:65494 0B)  16:50:48.907 [reactor-http-epoll-2] DEBUG reactor.netty.transport.TransportConnector - [id:d962b126] Connecting to [stream.binance.com/52.199.12.133:9443].  16:50:48.907 [reactor-http-epoll-1] DEBUG io.netty.resolver.dns.DnsNameResolver - [id: 0xdd7103d7] RECEIVED: UDP [33872: /127.0.0.53:53], DatagramDnsResponse(from: /127.0.0.53:53, 33872, QUERY(0), NoError(0), RD RA)      DefaultDnsQuestion(stream.binance.com. IN AAAA)      DefaultDnsRawRecord(OPT flags:0 udp:65494 0B)  16:50:49.162 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Registering pool release on close event for channel  16:50:49.163 [reactor-http-epoll-2] DEBUG reactor.netty.resources.PooledConnectionProvider - [id:d962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Channel connected, now: 1 active connections, 0 inactive connections and 0 pending acquire requests.  16:50:49.807 [reactor-http-epoll-2] DEBUG io.netty.handler.ssl.SslHandler - [id: 0xd962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] HANDSHAKEN: protocol:TLSv1.2 cipher suite:TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256  16:50:49.808 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] onStateChange(PooledConnection{channel=[id: 0xd962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443]}, [connected])  16:50:49.826 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] onStateChange(GET{uri=/, connection=PooledConnection{channel=[id: 0xd962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443]}}, [configured])  16:50:49.826 [reactor-http-epoll-2] DEBUG reactor.netty.http.client.HttpClientConnect - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Handler is being applied: {uri=wss://stream.binance.com:9443/ws, method=GET}  16:50:49.830 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] onStateChange(GET{uri=/ws, connection=PooledConnection{channel=[id: 0xd962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443]}}, [request_prepared])  16:50:49.839 [reactor-http-epoll-2] DEBUG reactor.netty.ReactorNetty - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Added encoder [reactor.left.httpAggregator] at the beginning of the user pipeline, full pipeline: [reactor.left.sslHandler, reactor.left.httpCodec, reactor.left.httpAggregator, reactor.right.reactiveBridge, DefaultChannelPipeline$TailContext#0]  16:50:49.839 [reactor-http-epoll-2] DEBUG reactor.netty.ReactorNetty - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Non Removed handler: reactor.left.httpMetricsHandler, context: null, pipeline: DefaultChannelPipeline{(reactor.left.sslHandler = io.netty.handler.ssl.SslHandler), (reactor.left.httpCodec = io.netty.handler.codec.http.HttpClientCodec), (reactor.left.httpAggregator = io.netty.handler.codec.http.HttpObjectAggregator), (reactor.right.reactiveBridge = reactor.netty.channel.ChannelOperationsHandler)}  16:50:49.840 [reactor-http-epoll-2] DEBUG 
reactor.netty.http.client.HttpClientOperations - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Attempting to perform websocket handshake with wss://stream.binance.com:9443/ws  16:50:49.846 [reactor-http-epoll-2] DEBUG io.netty.handler.codec.http.websocketx.WebSocketClientHandshaker13 - WebSocket version 13 client handshake key: 7FNVb427OHllyiM2Clg//g==, expected response: iTvQFIKtv7xyyXvmEAooh8NZPVw=  16:50:50.122 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] onStateChange(ws{uri=/ws, connection=PooledConnection{channel=[id: 0xd962b126, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443]}}, [response_received])  16:50:50.135 [reactor-http-epoll-2] DEBUG org.springframework.web.reactive.socket.adapter.ReactorNettyWebSocketSession - [36eb4d6b] Session id "36eb4d6b" for wss://stream.binance.com:9443/ws  16:50:50.135 [reactor-http-epoll-2] DEBUG org.springframework.web.reactive.socket.client.ReactorNettyWebSocketClient - Started session '36eb4d6b' for wss://stream.binance.com:9443/ws  16:50:50.147 [reactor-http-epoll-2] DEBUG reactor.netty.ReactorNetty - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] Added decoder [reactor.left.wsFrameAggregator] at the end of the user pipeline, full pipeline: [reactor.left.sslHandler, reactor.left.httpCodec, ws-decoder, ws-encoder, reactor.left.wsFrameAggregator, reactor.right.reactiveBridge, DefaultChannelPipeline$TailContext#0]  16:50:50.149 [reactor-http-epoll-2] DEBUG reactor.netty.channel.FluxReceive - [id:d962b126-1, L:/192.168.1.5:44690 - R:stream.binance.com/52.199.12.133:9443] FluxReceive{pending=0, cancelled=false, inboundDone=false, inboundError=null}: subscribing inbound receiver  16:50:50.150 [reactor-http-epoll-2] INFO TRACE - onSubscribe(FluxMap.MapSubscriber)  16:50:50.150 [reactor-http-epoll-2] INFO TRACE - request(256)  16:50:50.411 [reactor-http-epoll-2] INFO TRACE - onNext(evt)  16:50:50.413 [reactor-http-epoll-2] INFO TRACE - request(1)  ...  16:52:16.652 [reactor-http-epoll-2] INFO TRACE - onNext(evt)  16:52:16.652 [reactor-http-epoll-2] INFO TRACE - request(1)  16:52:17.168 [reactor-http-epoll-2] DEBUG reactor.netty.resources.PooledConnectionProvider - [id:d962b126-1, L:/192.168.1.5:44690 ! R:stream.binance.com/52.199.12.133:9443] Channel closed, now: 0 active connections, 0 inactive connections and 0 pending acquire requests.  16:52:17.169 [reactor-http-epoll-2] DEBUG reactor.netty.ReactorNetty - [id:d962b126-1, L:/192.168.1.5:44690 ! R:stream.binance.com/52.199.12.133:9443] Non Removed handler: reactor.left.httpAggregator, context: null, pipeline: DefaultChannelPipeline{(reactor.left.sslHandler = io.netty.handler.ssl.SslHandler), (ws-decoder = io.netty.handler.codec.http.websocketx.WebSocket13FrameDecoder), (ws-encoder = io.netty.handler.codec.http.websocketx.WebSocket13FrameEncoder), (reactor.left.wsFrameAggregator = io.netty.handler.codec.http.websocketx.WebSocketFrameAggregator), (reactor.right.reactiveBridge = reactor.netty.channel.ChannelOperationsHandler)}  A completed  A terminated  16:52:17.172 [reactor-http-epoll-2] INFO TRACE - onComplete()  B completed  B terminated  C success  C terminated  16:52:17.177 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126, L:/192.168.1.5:44690 ! 
R:stream.binance.com/52.199.12.133:9443] onStateChange(ws{uri=/ws, connection=PooledConnection{channel=[id: 0xd962b126, L:/192.168.1.5:44690 ! R:stream.binance.com/52.199.12.133:9443]}}, [response_completed])  16:52:17.177 [reactor-http-epoll-2] DEBUG reactor.netty.resources.DefaultPooledConnectionProvider - [id:d962b126, L:/192.168.1.5:44690 ! R:stream.binance.com/52.199.12.133:9443] onStateChange(ws{uri=/ws, connection=PooledConnection{channel=[id: 0xd962b126, L:/192.168.1.5:44690 ! R:stream.binance.com/52.199.12.133:9443]}}, [disconnecting])  

I tried to reproduce the issue with another technology (JavaScript) and everything ran fine there. Since the channel seems to be closed on the client side, I tried tuning the ChannelOptions at the TcpClient level... still no luck!

TcpClient wsTcp = TcpClient.create();
wsTcp.option(ChannelOption.AUTO_CLOSE, Boolean.FALSE);
wsTcp.option(ChannelOption.CONNECT_TIMEOUT_MILLIS, Integer.MAX_VALUE);
wsTcp.option(ChannelOption.AUTO_READ, Boolean.TRUE);
wsTcp.option(ChannelOption.SO_KEEPALIVE, Boolean.TRUE);
wsTcp.option(ChannelOption.SO_TIMEOUT, Integer.MAX_VALUE);

Here is a Java sample to reproduce the issue:

package test;    import java.net.URI;  import java.util.concurrent.CountDownLatch;    import org.springframework.web.reactive.socket.client.ReactorNettyWebSocketClient;    import reactor.core.publisher.EmitterProcessor;  import reactor.core.publisher.Flux;  import reactor.core.publisher.Mono;    public class wsTest {        public static void main(String[] args) throws InterruptedException {          CountDownLatch latch = new CountDownLatch(1);          ReactorNettyWebSocketClient wsclient = new ReactorNettyWebSocketClient();          wsclient.setMaxFramePayloadLength(Integer.MAX_VALUE);          EmitterProcessor<String> output = EmitterProcessor.create();          Mono<Void> execMono = wsclient.execute(URI.create("wss://stream.binance.com:9443/ws"),                  session -> session.send(Flux.just(session.textMessage("{\"method\": \"SUBSCRIBE\",\"params\":[\"!ticker@arr\"],\"id\": 1}")))                          .thenMany(session                                  .receive()                                  .doOnCancel(() -> System.out.println("A cancelled"))                                  .doOnComplete(() -> System.out.println("A completed"))                                  .doOnTerminate(() -> System.out.println("A terminated"))                                  .map(x -> "evt")                                  .log("TRACE")                                  .subscribeWith(output).then())                          .then());                    output.doOnCancel(() -> System.out.println("B cancelled"))                  .doOnComplete(() -> System.out.println("B completed"))                  .doOnTerminate(() -> System.out.println("B terminated"))                  .doOnSubscribe(s -> execMono                          .doOnCancel(() -> System.out.println("C cancelled"))                          .doOnSuccess(x -> System.out.println("C success"))                          .doOnTerminate(() -> System.out.println("C terminated"))                          .subscribe())                  .subscribe();            latch.await();      }  }  

I don't understand why I get completed/terminated events from the ReactorNettyWebSocketClient WebSocketHandler.

Thank you for your help,

CRONTAB: Cron Job Executing But Not Writing To File

Posted: 15 Aug 2021 08:05 AM PDT

I have a crontab job here to push to Github.

It works if I run my script manually, but when it executes via the cron job my txt file isn't updated and nothing is pushed upstream.

I've also made sure to run sudo chmod +x /Users/myuser/Development/auto-cal/test.sh

Not sure if I'm missing anything else - any guidance would be much appreciated!

Shell Script:

#!/usr/bin/env bash

echo $((1 + $RANDOM % 100)) > main.txt

git add .

DATE=$(date)

git commit -m "$DATE"

git push -u origin main

osascript -e 'display notification "pushed to remote" with title "SUCCESS"'

Crontab Job:

* * * * * sh /Users/myuser/Development/auto-cal/test.sh >/dev/null 2>&1  
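
One common cause, offered as a hedged guess: cron starts the script from your home directory with a minimal environment, so main.txt is written (and git runs) outside the repository. A sketch of the usual fix, using the paths from the question:

#!/usr/bin/env bash
# Run from inside the repo; cron does not inherit your interactive shell's
# working directory or PATH.
cd /Users/myuser/Development/auto-cal || exit 1

echo $((1 + RANDOM % 100)) > main.txt

git add .
git commit -m "$(date)"
git push -u origin main

osascript -e 'display notification "pushed to remote" with title "SUCCESS"'

It may also help to invoke the script with bash (or just by its path) in the crontab entry instead of sh, so the shebang and bash-isms like $RANDOM apply, and to log output to a file instead of /dev/null while debugging.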

HSTS header is not seen in image/JS/CSS files

Posted: 15 Aug 2021 08:05 AM PDT

I have added the following line in my .htaccess file to set the HSTS header

Header always set Strict-Transport-Security "max-age=63072000;includeSubDomains;"

To test the HSTS header, I have done the below steps:

  1. Access the application in the chrome browser
  2. Open the developer tool and check the HSTS header in the Response headers

The first time I access the application and check the developer tools, I can see the HSTS header on all the PHP, image, CSS, and JS files. When I load the application again, the images, CSS, and JS files are loaded from the memory/disk cache and are missing the HSTS header in their response headers. However, the PHP files still show the HSTS header in the response headers.

Is a missing HSTS header on the resource files (image/CSS/JS files) acceptable, or is it considered a security hole? If so, how can I fix it?

Thank you in advance

How can I access a POST response globally in Kotlin?

Posted: 15 Aug 2021 08:05 AM PDT

I have a Kotlin script that gets a token from a server. It then uses that token to perform another request. But I can't seem to use the token in the next request because the token has local scope.

This is my code:

class requests {
    private val client = OkHttpClient()
    private val coroutineScope = CoroutineScope(Dispatchers.Default)

    fun run() {
        coroutineScope.launch {
            val postBody = "encodedinfo".trimMargin()

            val request = Request.Builder()
                .url("https://example.com")
                .post(postBody.toRequestBody(MEDIA_TYPE_MARKDOWN))
                .build()

            client.newCall(request).execute().use { response ->
                if (!response.isSuccessful) throw IOException("Unexpected code $response")

                val token = response.body!!.string()
            }
        }
    }

    companion object {
        val MEDIA_TYPE_MARKDOWN = "text/x-markdown; charset=utf-8".toMediaType()
    }
}

With this code I can't use the token variable outside of the request code block. This block of code:

client.newCall(request).execute().use { response ->
    if (!response.isSuccessful) throw IOException("Unexpected code $response")

    val token = response.body!!.string()
}

This is the first request that gets the token variable, and I need the token in another function so that I can perform the second request and load the user's info.
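
A minimal sketch of one way around the scoping problem (the URL, body, and second request are placeholders): return the token from a suspend function instead of launching a fire-and-forget coroutine, so the value is available to the caller.

import java.io.IOException
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext
import okhttp3.MediaType.Companion.toMediaType
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.toRequestBody

class Requests {
    private val client = OkHttpClient()

    suspend fun fetchToken(): String = withContext(Dispatchers.IO) {
        val request = Request.Builder()
            .url("https://example.com")
            .post("encodedinfo".toRequestBody(MEDIA_TYPE_MARKDOWN))
            .build()

        client.newCall(request).execute().use { response ->
            if (!response.isSuccessful) throw IOException("Unexpected code $response")
            response.body!!.string() // returned from fetchToken
        }
    }

    suspend fun loadUserInfo() {
        val token = fetchToken() // the token is now in scope here
        // ... build the second request, e.g. with "Authorization: Bearer $token" ...
    }

    companion object {
        val MEDIA_TYPE_MARKDOWN = "text/x-markdown; charset=utf-8".toMediaType()
    }
}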

Advanced animations question (websites). How to make image animations like on the following website

Posted: 15 Aug 2021 08:04 AM PDT

http://www.mikimottes.com/ is the website I'm asking about. I've learned how to make simple animations, but when I discovered this website I fell in love with it and want to learn how to build sites like it. How are the image animations, for example, developed? My guess is that it might be something related to GIFs, but then how could a GIF be controlled? Sometimes the image animations stop and then something triggers them to animate again, which seems impossible with GIFs. It doesn't look like a JS animation either. Please assist me with this question. :)

MyPy: Can not suppress [no-untyped-call]

Posted: 15 Aug 2021 08:04 AM PDT

I have been using MyPy for Python type analysis, and I am trying to ignore a specific type warning.

Here is my mypy.ini

[mypy]
python_version = 3.9

# Needed because of bug in MyPy
disallow_subclassing_any = False

# Options to make the checking stricter.
check_untyped_defs = True
disallow_untyped_defs = True
disallow_untyped_calls = True

# Plugins
plugins = numpy.typing.mypy_plugin

[mypy-numpy.*]
allow_untyped_defs = True
allow_untyped_calls = True
implicit_reexport = True

Here is my sample py code,

import numpy as np
import numpy.typing as npt


def test() -> npt.NDArray[np.float32]:
    distance_win: npt.NDArray[np.float32] = np.kron(
        np.ones((10, 1)),
        np.ones((10, 1))
    )
    print(distance_win)
    return distance_win


if __name__ == '__main__':
    print(test())

When I run the mypy like below,

mypy --config-file mypy.ini tests/experiments/py_templ.py  

I get following output

tests\experiments\py_templ.py:32: error: Call to untyped function "kron" in typed context  [no-untyped-call]
Found 1 error in 1 file (checked 1 source file)

But in the ini file I set allow_untyped_calls, so why does MyPy still complain?

In general, I could not figure out the scoping rules of the mypy.ini specification. Please help.
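
For what it's worth, per-module sections act on the module being checked, i.e. the one making the call, so [mypy-numpy.*] does not relax no-untyped-call errors reported in your own file. A sketch of a config that targets the calling module instead (the section name assumes the file is checked as tests.experiments.py_templ); alternatively, a single call site can be silenced with a trailing # type: ignore[no-untyped-call] comment.

[mypy]
python_version = 3.9
disallow_untyped_defs = True
disallow_untyped_calls = True
plugins = numpy.typing.mypy_plugin

# Applies to your code, which is where the "call to untyped function" is reported.
[mypy-tests.experiments.*]
disallow_untyped_calls = False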

GOPATH/GOROOT setup for Hyperledger fabric chaincode

Posted: 15 Aug 2021 08:04 AM PDT

I have done the basic setup in Hyperledger Fabric 1.4.7 and made all peers join the channel. After exploring chaincode in Golang, I tried the chaincode install command and ran into issues at this step.

error

Error: error getting chaincode code mycc: path to chaincode does not exist: /home/user899/go/src/chaincode

When I just ran go run chaincode.go

error

chaincode.go:7:2: cannot find package "github.com/hyperledger/fabric-chaincode-go/shim" in any of: /usr/lib/go-1.14/src/github.com/hyperledger/fabric-chaincode-go/shim (from $GOROOT) /home/user899/go/src/github.com/hyperledger/fabric-chaincode-go/shim (from $GOPATH) chaincode.go:8:2: cannot find package "github.com/hyperledger/fabric-protos-go/peer" in any of: /usr/lib/go-1.14/src/github.com/hyperledger/fabric-protos-go/peer (from $GOROOT) /home/user899/go/src/github.com/hyperledger/fabric-protos-go/peer (from $GOPATH)

Environments

echo $GOPATH
/home/user899/go

Chaincode location
/home/user899/SoloNetwork/chaincode

Can anyone help?

Thanks
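
For reference, a sketch of the usual GOPATH-mode setup for Fabric 1.4 Go chaincode (paths taken from the question; adjust as needed): peer chaincode install -p resolves the path relative to $GOPATH/src, and the shim/protos packages have to be present in the GOPATH.

# Put (or symlink) the chaincode under $GOPATH/src so the peer can resolve it.
mkdir -p $GOPATH/src/chaincode
cp /home/user899/SoloNetwork/chaincode/chaincode.go $GOPATH/src/chaincode/

# Fetch the dependencies the compiler is complaining about into the GOPATH.
go get github.com/hyperledger/fabric-chaincode-go/shim
go get github.com/hyperledger/fabric-protos-go/peer

# Then install with the GOPATH-relative path:
# peer chaincode install -n mycc -v 1.0 -p chaincode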

Doorkeeper API authentication with token-based tutorial needed

Posted: 15 Aug 2021 08:04 AM PDT

I am looking for a Doorkeeper gem tutorial that covers access token generation for different users from their email and password. I want to take a user's email and password and generate an access token for them with Doorkeeper. Is there any tutorial for this?
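
For reference, a minimal sketch of the password grant setup in Doorkeeper (the email lookup and the Devise-style password check are assumptions; adapt them to your User model):

# config/initializers/doorkeeper.rb
Doorkeeper.configure do
  grant_flows %w[password]

  resource_owner_from_credentials do |_routes|
    user = User.find_by(email: params[:email])
    user if user&.valid_password?(params[:password]) # Devise-style check
  end
end

# A token can then be requested per user with:
# POST /oauth/token
#   grant_type=password&email=...&password=...&client_id=...&client_secret=...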

Django `LookupError: App 'accounts' doesn't have a 'User' model` causes AUTH_USER_MODEL to fail with `accounts.User` has not been installed

Posted: 15 Aug 2021 08:03 AM PDT

I am trying to refactor an existing code base by creating a new accounts app with a new custom User model. When I try to run makemigrations, I get the following error:

Traceback (most recent call last):    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/apps/config.py", line 268, in get_model      return self.models[model_name.lower()]  KeyError: 'user'    During handling of the above exception, another exception occurred:    Traceback (most recent call last):    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/contrib/auth/__init__.py", line 160, in get_user_model      return django_apps.get_model(settings.AUTH_USER_MODEL, require_ready=False)    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/apps/registry.py", line 211, in get_model      return app_config.get_model(model_name, require_ready=require_ready)    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/apps/config.py", line 270, in get_model      raise LookupError(  LookupError: App 'accounts' doesn't have a 'User' model.    During handling of the above exception, another exception occurred:    Traceback (most recent call last):    File "/home/dev/Projects/Foto-Dino/foto-dino/manage.py", line 22, in <module>      main()    File "/home/dev/Projects/Foto-Dino/foto-dino/manage.py", line 18, in main      execute_from_command_line(sys.argv)    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/core/management/__init__.py", line 419, in execute_from_command_line      utility.execute()    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/core/management/__init__.py", line 395, in execute      django.setup()    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/__init__.py", line 24, in setup      apps.populate(settings.INSTALLED_APPS)    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/apps/registry.py", line 122, in populate      app_config.ready()    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/contrib/admin/apps.py", line 27, in ready      self.module.autodiscover()    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/contrib/admin/__init__.py", line 24, in autodiscover      autodiscover_modules('admin', register_to=site)    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/utils/module_loading.py", line 47, in autodiscover_modules      import_module('%s.%s' % (app_config.name, module_to_search))    File "/usr/lib/python3.9/importlib/__init__.py", line 127, in import_module      return _bootstrap._gcd_import(name[level:], package, level)    File "<frozen importlib._bootstrap>", line 1030, in _gcd_import    File "<frozen importlib._bootstrap>", line 1007, in _find_and_load    File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked    File "<frozen importlib._bootstrap>", line 680, in _load_unlocked    File "<frozen importlib._bootstrap_external>", line 850, in exec_module    File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/contrib/auth/admin.py", line 6, in <module>      from django.contrib.auth.forms import (    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/contrib/auth/forms.py", line 21, in <module>      UserModel = get_user_model()    File "/home/dev/.virtualenvs/foto-dino/lib/python3.9/site-packages/django/contrib/auth/__init__.py", line 164, in get_user_model      raise ImproperlyConfigured(  django.core.exceptions.ImproperlyConfigured: AUTH_USER_MODEL refers to model 
'accounts.User' that has not been installed    

I know this question has been asked many times before, and I have looked through many previous posts, but none of the solutions worked. The main one that kept coming up was adding accounts to INSTALLED_APPS and setting AUTH_USER_MODEL to "accounts.User". I also tried deleting all migrations folders and the database file. accounts/models.py looks like this:

from django.contrib.auth.base_user import BaseUserManager  from django.contrib.auth.models import AbstractBaseUser, PermissionsMixin  from django.utils.translation import ugettext_lazy as _  from django.db import models  from rest_framework.authtoken.models import Token  from django.contrib.auth.hashers import make_password  from utils.uid import uidgen    class UserManager(BaseUserManager):      def create_user(self, email, password, **extra_fields):          """          Create and save a User with the given email and password.          """          if not email:              raise ValueError(_("The Email must be set"))          email = self.normalize_email(email)          password = make_password(password)          user = self.model(email=email, **extra_fields)          user.save()          if user.role == User.Roles.SECRETARY:              user.is_staff = True          Token.objects.create(user=user)          return user        def create_superuser(self, email, password, **extra_fields):          """          Create and save a SuperUser with the given email and password.          """          extra_fields.setdefault("is_staff", True)          extra_fields.setdefault("is_superuser", True)          extra_fields.setdefault("is_active", True)            admin_user = self.create_user(email, password, **extra_fields)          admin_user.is_staff = True          admin_user.is_superuser = True          admin_user.save()          return admin_user      class User(AbstractBaseUser, PermissionsMixin):      id = models.CharField(          max_length=8, unique=True, primary_key=True, default=uidgen, editable=False      )      first_name = models.CharField(_("First Name"), max_length=30, null=False)      last_name = models.CharField(_("Last Name"), max_length=30, null=True, blank=True)      email = models.EmailField(          verbose_name="Email", max_length=255, unique=True, null=False      )      ...      # more fields here      ...        class Meta:          abstract = True        def __str__(self):          return f"{self.first_name} {self.last_name}"        objects = UserManager()      class Client(User):      # client details here      pass  

In Settings.py I have:

...
# settings above
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "social_django",
    "accounts",
    # "admin_portal",
    # "main",
    # "photographer",
    # "photoshoper",
    # "printer",
    # "secretary",
    # "phonenumber_field",
]
...
# more settings here
...
AUTH_USER_MODEL = "accounts.User"

I even tried adding the following to my accounts/admin.py:

from django.contrib import admin
from accounts.models import User

admin.sites.register(User)
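
One detail worth checking in the model above, offered as a hedged sketch rather than a confirmed fix: the Meta class on User declares abstract = True, and an abstract model is never registered with the app registry, which is exactly what LookupError: App 'accounts' doesn't have a 'User' model means. AUTH_USER_MODEL has to point at a concrete model.

from django.contrib.auth.models import AbstractBaseUser, PermissionsMixin

# Sketch: make User concrete; if Client should share its fields, keep Client
# as a subclass (multi-table inheritance) or move the shared fields into a
# separate abstract base instead. UserManager is the manager defined above.
class User(AbstractBaseUser, PermissionsMixin):
    ...  # fields as above

    class Meta:
        # abstract = True   <- remove this; AUTH_USER_MODEL needs a concrete model
        verbose_name = "user"

    objects = UserManager()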

Combine two JSON then add title

Posted: 15 Aug 2021 08:05 AM PDT

I have two JSON objects and I'm planning to join them.

obj1 = [{'foo': 'bar', 'foo1': 'bar1'}];
obj2 = [{'foo2': 'bar2'}];

objFinal = [
  {
    "foo": "bar",
    "foo1": "bar1",
    "title": [
      { "foo2": "bar2" }
    ]
  }
]

My plan is to get a result the same as objFinal. I've used .concat but it didn't return the result I wanted. What other options can I use to get the same result as objFinal?
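
A minimal sketch of one option in plain JavaScript: copy the keys of the first object and attach the second array under a title key, rather than concatenating the two arrays.

// Copy each item of obj1 and nest obj2 under "title".
const obj1 = [{ foo: 'bar', foo1: 'bar1' }];
const obj2 = [{ foo2: 'bar2' }];

const objFinal = obj1.map(item => ({ ...item, title: obj2 }));

console.log(JSON.stringify(objFinal, null, 2));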

Number of arrays to sort is input by the user. Sort an array A[] of size N containing positive integers; arrange the elements of the array in increasing order

Posted: 15 Aug 2021 08:05 AM PDT

Input by user:

  • Number of arrays to be sorted:
  • Size of array:
  • Array elements :

Like -

2
3
5 4 1
4
8 7 5 4
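
A minimal sketch, assuming C is acceptable: read the number of arrays, then for each one read its size and its elements, sort them in increasing order, and print.

#include <stdio.h>
#include <stdlib.h>

/* Comparator for qsort: ascending order of ints. */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void) {
    int t;
    if (scanf("%d", &t) != 1) return 1;        /* number of arrays */
    while (t-- > 0) {
        int n;
        if (scanf("%d", &n) != 1) return 1;    /* size of this array */
        int *a = malloc(n * sizeof *a);
        if (a == NULL) return 1;
        for (int i = 0; i < n; i++) scanf("%d", &a[i]);
        qsort(a, n, sizeof *a, cmp_int);       /* increasing order */
        for (int i = 0; i < n; i++) printf("%d ", a[i]);
        printf("\n");
        free(a);
    }
    return 0;
}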

Want to hide all tags after 3 tags, and make it work for every copy of the HTML: if I use the HTML block one more time, it should work with the first one and also with the second

Posted: 15 Aug 2021 08:04 AM PDT

HTML --> If I write the HTML two times, the JavaScript should apply to both of them independently. It would be very helpful, please.

        <li>Windows 11</li>
        <li>8GB Ram</li>
        <li>1TB SSD</li>
        <li>Intel Core i7 (11th Gen)</li>
        <li>NVIDIA GeForce GTX 1050</li>
        <li>15.6 inch Full HD Display</li>
        <li>Dolby Audio</li>
        <li>1 Year Onsite Warranty</li>
    </ul>
    <button id="demo" onclick="reed(this)">Show More</button><br>

When the button is clicked two times it shows the tags; I want that to happen on one click. It would be very helpful if you could help me solve this bug.

CSS --> Now we can hide the tags after 5 tags, but there are other problems

.deviceNameCardData li:nth-child(1n+6){font-size: 0px;}

JavaScript --> I want to use the same code below every time, whether the HTML block above appears 2 or 3 times. So far I can only use it for one copy. It would be very helpful, please.

function reed(ddd) {
    var edd = document.getElementById("deviceNameCardData").getElementsByTagName("li");

    for (var i = 5; i < edd.length; i++) {
        if (edd[i].style.fontSize === "0px") {
            edd[i].style.fontSize = "16px";
            ddd.innerHTML = "Show Less";
        } else if (edd.length > 5) {
            ddd.innerHTML = "Show More";
            for (var i = 5; i < edd.length; i++) {
                edd[i].style.fontSize = "0px";
            }
        }
    }
}
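
A sketch of one way to make the same handler work for every copy of the block (this assumes the markup keeps the button right after its <ul>, as in the HTML above): look the list up relative to the clicked button instead of by a fixed id.

// Each button toggles only the <ul> that sits directly before it.
function reed(btn) {
    var list = btn.previousElementSibling;            // the <ul> before the button
    var items = list.getElementsByTagName("li");
    var collapsed = items.length > 5 && items[5].style.fontSize === "0px";

    for (var i = 5; i < items.length; i++) {
        items[i].style.fontSize = collapsed ? "16px" : "0px";
    }
    btn.innerHTML = collapsed ? "Show Less" : "Show More";
}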

Can I use OOP in Redux Store actions

Posted: 15 Aug 2021 08:04 AM PDT

I'm using React and Redux. I know that Redux action functions should be pure functions.

But, I was reading about Object Oriented Programming (OOP) and its benefits.

So, can I use OOP in Redux actions (a class) instead of pure functions?

For example, if I have a state named "articles", it may have actions like:

function getArtcles() { ... }
function addArtcle({data}) { ... }
function deleteArtcle(id) { ... }
......

So, can I replace that with:

class Artice {
    getArtcles() {...}
    addArtcle() {...}
    deleteArtcle() {...}
}

??
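
For what it's worth, a small sketch of a middle ground (names are illustrative): the dispatched actions themselves have to remain plain, serializable objects for Redux, but nothing prevents grouping the action creators as static methods on a class.

// Action creators grouped on a class; each still returns a plain object.
class ArticleActions {
  static getArticles() {
    return { type: 'articles/get' };
  }
  static addArticle(data) {
    return { type: 'articles/add', payload: data };
  }
  static deleteArticle(id) {
    return { type: 'articles/delete', payload: id };
  }
}

// Usage is unchanged from plain action-creator functions:
// store.dispatch(ArticleActions.addArticle({ title: 'Hello' }));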

What is the purpose of compare(key, key) in Java TreeMap put() method when root == null?

Posted: 15 Aug 2021 08:05 AM PDT

I am reading the Java 1.8 TreeMap source code. The put method is written as below:

public V put(K key, V value) {
    Entry<K,V> t = root;
    if (t == null) {
        compare(key, key); // type (and possibly null) check

        root = new Entry<>(key, value, null);
        size = 1;
        modCount++;
        return null;
    }
    ...... // more code
}

When root == null, which means it is the first time we put an Entry, why do we still need to compare the key with itself before we assign the entry to root?

BTW, the compare method is:

final int compare(Object k1, Object k2) {
    return comparator == null ? ((Comparable<? super K>) k1).compareTo((K) k2)
        : comparator.compare((K) k1, (K) k2);
}
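
A small illustration (not taken from the JDK source) of why that self-compare is there: it makes the very first put fail fast for keys that a natural-ordering TreeMap could never sort, instead of silently storing them and failing on a later operation.

import java.util.TreeMap;

public class TreeMapFirstPut {
    static class NotComparable {} // does not implement Comparable

    public static void main(String[] args) {
        TreeMap<Object, String> map = new TreeMap<>(); // no Comparator supplied

        try {
            map.put(null, "x");                        // compare(null, null)
        } catch (NullPointerException e) {
            System.out.println("null key rejected on the first put");
        }

        try {
            map.put(new NotComparable(), "x");         // cast to Comparable fails
        } catch (ClassCastException e) {
            System.out.println("non-Comparable key rejected on the first put");
        }
    }
}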

Assign column names in pandas from a pd.Series in a loop

Posted: 15 Aug 2021 08:04 AM PDT

Let's say my df is:

A  User1  User2
1  0      0
2  1      1
3  1      0

And I want to create another column in the dataframe that is the sum of column user1 plus user2. That's easy: df['sum_result_Newcolum'] = df['user1'] + df['user2']. It looks like this:

A  User1  User2  sum_result_Newcolum
1  0      0      0
2  1      1      2
3  1      0      1

But I have a pandas.Series that looks like this:

    0
0   AC.IR
1   AC.PO
2   B.0
3   B.1

and so on. I want to write a for statement that runs through the pd.Series and adds each value to the column name. That's the issue; I tried something like this without any success:

df['sum_result_'+ {cv_details.get(0)}]=df['user1']+df['user2']
df['sum_result_'+ cv_details.get(0)]=df['user1']+df['user2']
df['sum_result_'+ cv_details[0]]=df['user1']+df['user2']
df['sum_result_'+ cv_details.iloc[0]]=df['user1']+df['user2']

Every time, the best I get is a column name like "sum_result_Action...".

Maybe the solution is to convert the pd.Series to a frame?
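
A minimal sketch, assuming cv_details is the single-column object shown above and the goal is one new sum column per value in it:

import pandas as pd

df = pd.DataFrame({"User1": [0, 1, 1], "User2": [0, 1, 0]})
cv_details = pd.Series(["AC.IR", "AC.PO", "B.0", "B.1"])

# If cv_details is actually a DataFrame with a column literally named 0,
# take that column first: cv_details = cv_details[0]
for name in cv_details:  # iterate over the values, not the index
    df["sum_result_" + str(name)] = df["User1"] + df["User2"]

print(df.columns.tolist())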

PyPI package: found module but no type hints or library stubs

Posted: 15 Aug 2021 08:04 AM PDT

I just made my first python package and uploaded it to PyPI (https://pypi.org/project/pygraphsearch/).

I then made some test code that uses it. I ran pip install pygraphsearch to download my package. Everything works fine except mypy complains that "found module but no type hints or library stubs". I used typing everywhere in my package, so I don't know why it complains about that. What am I doing wrong?

Update

I added a py.typed file, like it says on https://mypy.readthedocs.io/en/stable/installed_packages.html#making-pep-561-compatible-packages, but it still doesn't work.

I also added the package_data parameter in the call to setuptools.setup in setup.py.

package_dir={"": "src"},
package_data={"pygraphsearch": ["py.typed"]},

My file structure in case it's relevant looks like this:

pygraphsearch
├── pyproject.toml
├── py.typed
├── README.md
├── setup.py
└── src
    └── pygraphsearch
        ├── Edge.py
        ├── Frontier.py
        ├── __init__.py
        ├── IterativeDeepeningFrontier.py
        ├── Node.py
        ├── search.py
        ├── Stack.py
        ├── State.py
        └── UndirectedEdge.py
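
One thing worth checking, offered as a guess: PEP 561 expects the py.typed marker to live inside the package directory (next to __init__.py) so it gets installed alongside the modules, and package_data={"pygraphsearch": ["py.typed"]} also looks for it relative to that package directory; the mypy docs additionally recommend zip_safe=False in setup.py. A sketch of the adjusted layout:

pygraphsearch
├── pyproject.toml
├── README.md
├── setup.py
└── src
    └── pygraphsearch
        ├── py.typed        <- marker moved inside the package
        ├── __init__.py
        └── ... (the rest of the modules)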

How do I pass the user info from the detail page to another page?

Posted: 15 Aug 2021 08:06 AM PDT

Flutter Firestore - StreamBuilder Users > ListView Users > DetailPage User > Message User

So I have a stream of my Users Collection from Firestore.

On tap it goes to the detail page (profile) of that user. But now I also want to integrate the possibility of sending that user a message. So I need to pass the data from the detail page to another page. How do I pass the user info from the detail page to another page?
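
A minimal sketch of one way to do it (class and field names are assumptions): pass the user the detail page already holds on to the message page through its constructor when navigating.

import 'package:flutter/material.dart';

class MessagePage extends StatelessWidget {
  final Map<String, dynamic> user; // or your own User model class
  const MessagePage({Key? key, required this.user}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Message ${user['name']}')),
      body: const Center(child: Text('Chat goes here')),
    );
  }
}

// Inside the detail page, e.g. in the onPressed of a "Message" button:
// Navigator.push(
//   context,
//   MaterialPageRoute(builder: (_) => MessagePage(user: widget.user)),
// );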

GAS: Set Chart Vertical Axis > Show Axis Line

Posted: 15 Aug 2021 08:04 AM PDT

I'm using some script to calculate min and max values for the Vertical Axis depending on the chart's value range, which works fine.

chart = chart.modify()
    .setOption('vAxis.viewWindow.min', minVal)
    .setOption('vAxis.viewWindow.max', maxVal)
    .build();

sheet.updateChart(chart);

However, every time I use this, the Vertical Axis > Show Axis Line checkbox in the Chart Editor gets disabled. I've searched for an Option at Line Chart Options to set this in the function, but there doesn't seem to be one.

I've also tried recording a Macro while enabling this setting, to see in the resulting code which Option is being set, and another Macro while disabling it, but both macros are identical and do not show any such Option.

Anyone have a suggestion on how to set (or retain) this chart option?

Improving speed of bit copying in a lossless audio encoding algorithm (written in C)

Posted: 15 Aug 2021 08:05 AM PDT

I'm trying to implement a lossless audio codec that will be able to process data coming in at roughly 190 kHz to then be stored to an SD card using SPI DMA. I've found that the algorithm basically works, but has certain bottlenecks that I can't seem to overcome. I was hoping to get some advice on how to best optimize a certain portion of the code that I found to be the "slowest". I'm writing in C on a TI DSP and am using -O3 optimization.

for (j = packet_to_write.bfp_bits; j > 0; j--)
{
    encoded_data[(filled / 16)] |= ((buf_filt[i] >> (j - 1)) & 1) << (filled++ % 16);
}

In this section of code, I am taking X number of bits from the original data and fitting it into a buffer of encoded data. I've found that the loop is fairly costly and when I am working with a set of data represented by 8+ bits, then this code is too slow for my application. Loop unrolling doesn't really work here since each block of data can be represented by a different number of bits. The "filled" variable represents a bit counter filling up Uint16 indices in the encoded_data buffer.

I'd like some help understanding where bottlenecks may come from in this snippet of code (and hopefully I can take those findings and apply that to other areas of the algo). The authors of the paper that I'm reading (whose algorithm I'm trying to replicate) noted that they used a mixture of C and assembly code, but I'm not sure how assembly would be useful in this case.
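
One direction that may help, sketched below (not a drop-in replacement: it writes each field as one chunk in value order rather than bit-reversed like the loop above, so the decoder must use the same convention): accumulate bits in a 32-bit register and flush whole 16-bit words, so the cost per sample is a couple of shifts and ORs instead of bfp_bits loop iterations.

#include <stdint.h>

typedef struct {
    uint16_t *out;      /* encoded_data buffer          */
    uint32_t  acc;      /* bit accumulator              */
    unsigned  acc_bits; /* number of valid bits in acc  */
    unsigned  words;    /* 16-bit words written so far  */
} BitWriter;

/* Insert nbits (1..16) of value into the stream in one shot. */
static inline void put_bits(BitWriter *w, uint16_t value, unsigned nbits)
{
    w->acc |= (uint32_t)(value & ((1u << nbits) - 1u)) << w->acc_bits;
    w->acc_bits += nbits;

    while (w->acc_bits >= 16) {          /* flush full 16-bit words */
        w->out[w->words++] = (uint16_t)w->acc;
        w->acc >>= 16;
        w->acc_bits -= 16;
    }
}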

Finally, the code itself is functional and I have done some extensive testing on actual audio samples. It's just not fast enough for real-time!

Thanks!

How to get accesstoken from WebClient like in RestTemplate?

Posted: 15 Aug 2021 08:05 AM PDT

I was going through the Spring Security OAuth 2.x migration to Spring Security 5.2 and encountered the following blocker. As I can see, OAuth2RestTemplate is not used anymore and WebClient is recommended instead. So I was making changes to my codebase to make WebClient work.

In OAuth2 we have an option to get the token straight from the RestTemplate using oAuth2RestTemplate.getAccessToken(); I couldn't find anything similar in WebClient. This is to call an external microservice. Is there any option to get the access token from WebClient? Or is it handled in a different way there?
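
For reference, a sketch under Spring Security 5.2+ (the registration id and principal are placeholders, and an OAuth2AuthorizedClientManager bean is assumed to be configured): the token is obtained from the authorized client manager rather than from the client itself, or attached automatically via the exchange filter function.

import org.springframework.security.oauth2.client.OAuth2AuthorizeRequest;
import org.springframework.security.oauth2.client.OAuth2AuthorizedClient;
import org.springframework.security.oauth2.client.OAuth2AuthorizedClientManager;
import org.springframework.security.oauth2.client.web.reactive.function.client.ServletOAuth2AuthorizedClientExchangeFilterFunction;
import org.springframework.web.reactive.function.client.WebClient;

class TokenExample {

    // Get the raw token value, similar to oAuth2RestTemplate.getAccessToken().
    String fetchToken(OAuth2AuthorizedClientManager manager) {
        OAuth2AuthorizedClient client = manager.authorize(
                OAuth2AuthorizeRequest.withClientRegistrationId("my-registration")
                        .principal("my-service")
                        .build());
        return client.getAccessToken().getTokenValue();
    }

    // Or let WebClient attach the token to every outgoing request.
    WebClient buildClient(OAuth2AuthorizedClientManager manager) {
        ServletOAuth2AuthorizedClientExchangeFilterFunction oauth2 =
                new ServletOAuth2AuthorizedClientExchangeFilterFunction(manager);
        oauth2.setDefaultClientRegistrationId("my-registration");
        return WebClient.builder().apply(oauth2.oauth2Configuration()).build();
    }
}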

Convert NumPy Arrays to a Tensor

Posted: 15 Aug 2021 08:06 AM PDT

I converted a file to a dataframe with pandas and now I would like to train a deep learning model via TensorFlow. I can't manage to train the model: after dividing into training and test sets, when I go to compile and fit the model it tells me

ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type numpy.ndarray).

I thought the problem was that the numpy arrays had different sizes, but despite performing padding (in this way all the arrays had the same dimension inside the column), the problem was not solved. Below I insert an example of a column I have inside the dataset: if I wanted to transform this into a tensor, how should I do it?

df = pd.read_parquet('example.parquet')  df['column']    0                            [0, 1, 1, 1, 0, 1, 0, 1, 0]  1          [0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0]  2          [0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 1, 0, 1]  3                      [0, 1, 1, 1, 1, 1, 0, 1, 0, 1, 1]  4                   [0, 1, 1, 1, 1, 1, 0, 1, 0, 1, 1, 0]                           ...                          115                          [0, 1, 0, 0, 1, 1, 1, 1, 1]  116    [0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1, ...  117     [0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1]  118    [0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, ...  119                    [0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1]  

Clearly I have inserted the original column, not the one on which I padded unsuccessfully.

These are the steps that I did to train the model, if they can be useful

from sklearn.preprocessing import LabelEncoder

label_encoder = LabelEncoder()
Y = label_encoder.fit_transform(Y)
X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.3, random_state=42)

# create model
model = Sequential()

# add model layers
model.add(BatchNormalization())
model.add(Dense(20, activation='softmax', input_shape=(X_train.shape)))

# compile model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, validation_data=(X_test, y_test), epochs=50)
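
A minimal sketch of turning the ragged column into something TensorFlow accepts (assuming the goal is one rectangular float32 matrix; df is the dataframe from the question): pad every row to the same length and stack, so X becomes a single 2-D numeric array rather than a 1-D object array whose elements are arrays of different lengths.

import numpy as np
import tensorflow as tf

sequences = df['column'].tolist()  # list of variable-length lists

# Pad every row to the length of the longest one, then stack into a 2-D array.
X = tf.keras.preprocessing.sequence.pad_sequences(
    sequences, padding='post', dtype='float32')  # shape (n_rows, max_len)

X = np.asarray(X, dtype=np.float32)
print(X.shape, X.dtype)

# The key point: the model input must be one rectangular numeric array;
# an object-dtype array of ragged sub-arrays cannot be converted to a tensor.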

UPDATE: Complete traceback

ValueError: Failed to convert a NumPy array to a Tensor (Unsupported   object type numpy.ndarray).  ---------------------------------------------------------------------   ------  ValueError                                Traceback (most recent call   last)  ~\AppData\Local\Temp/ipykernel_16380/3421148994.py in <module>    1 from livelossplot import PlotLossesKeras    2   ----> 3 model.fit(X_train, y_train, validation_data=(X_test, y_test),   epochs=50, callbacks=[PlotLossesKeras()])    ~\AppData\Local\Programs\Python\Python39\lib\site-   packages\keras\engine\training.py in fit(self, x, y, batch_size,   epochs, verbose, callbacks, validation_split, validation_data,   shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch,   validation_steps, validation_batch_size, validation_freq,   max_queue_size, workers, use_multiprocessing)  1132          training_utils.RespectCompiledTrainableState(self):  1133       # Creates a `tf.data.Dataset` and handles batch and epoch   iteration.  -> 1134       data_handler = data_adapter.get_data_handler(  1135           x=x,  1136           y=y,    ~\AppData\Local\Programs\Python\Python39\lib\site-   packages\keras\engine\data_adapter.py in get_data_handler(*args,   **kwargs)  1381   if getattr(kwargs["model"], "_cluster_coordinator", None):  1382     return _ClusterCoordinatorDataHandler(*args, **kwargs)  -> 1383   return DataHandler(*args, **kwargs)  1384   1385  ~\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\engine\data_adapter.py in __init__(self, x, y, sample_weight, batch_size, steps_per_epoch, initial_epoch, epochs, shuffle, class_weight, max_queue_size, workers, use_multiprocessing, model, steps_per_execution, distribute)     1136      1137     adapter_cls = select_data_adapter(x, y)  -> 1138     self._adapter = adapter_cls(     1139         x,     1140         y,    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\engine\data_adapter.py in __init__(self, x, y, sample_weights, sample_weight_modes, batch_size, epochs, steps, shuffle, **kwargs)      228                **kwargs):      229     super(TensorLikeDataAdapter, self).__init__(x, y, **kwargs)  --> 230     x, y, sample_weights = _process_tensorlike((x, y, sample_weights))      231     sample_weight_modes = broadcast_sample_weight_modes(      232         sample_weights, sample_weight_modes)    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\engine\data_adapter.py in _process_tensorlike(inputs)     1029     return x     1030   -> 1031   inputs = tf.nest.map_structure(_convert_numpy_and_scipy, inputs)     1032   return tf.__internal__.nest.list_to_tuple(inputs)     1033  ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\util\nest.py in map_structure(func, *structure, **kwargs)      867       868   return pack_sequence_as(  --> 869       structure[0], [func(*x) for x in entries],      870       expand_composites=expand_composites)      871     ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\util\nest.py in <listcomp>(.0)      867       868   return pack_sequence_as(  --> 869       structure[0], [func(*x) for x in entries],      870       expand_composites=expand_composites)      871     ~\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\engine\data_adapter.py in _convert_numpy_and_scipy(x)     1024       if issubclass(x.dtype.type, np.floating):     1025         dtype = backend.floatx()  -> 1026       return tf.convert_to_tensor(x, dtype=dtype)     1027     elif 
_is_scipy_sparse(x):     1028       return _scipy_sparse_to_sparse_tensor(x)    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\util\dispatch.py in wrapper(*args, **kwargs)      204     """Call target, and fall back on dispatchers if there is a TypeError."""      205     try:  --> 206       return target(*args, **kwargs)      207     except (TypeError, ValueError):      208       # Note: convert_to_eager_tensor currently raises a ValueError, not a  ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\ops.py in convert_to_tensor_v2_with_dispatch(value, dtype, dtype_hint, name)     1428     ValueError: If the `value` is a tensor not of given `dtype` in graph mode.     1429   """  -> 1430   return convert_to_tensor_v2(     1431       value, dtype=dtype, dtype_hint=dtype_hint, name=name)     1432     ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\ops.py in convert_to_tensor_v2(value, dtype, dtype_hint, name)     1434 def convert_to_tensor_v2(value, dtype=None, dtype_hint=None, name=None):     1435   """Converts the given `value` to a `Tensor`."""  -> 1436   return convert_to_tensor(     1437       value=value,     1438       dtype=dtype,    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\profiler\trace.py in wrapped(*args, **kwargs)      161         with Trace(trace_name, **trace_kwargs):      162           return func(*args, **kwargs)  --> 163       return func(*args, **kwargs)      164       165     return wrapped    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\ops.py in convert_to_tensor(value, dtype, name, as_ref, preferred_dtype, dtype_hint, ctx, accepted_result_types)     1564      1565     if ret is None:  -> 1566       ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)     1567      1568     if ret is NotImplemented:  ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\tensor_conversion_registry.py in _default_conversion_function(***failed resolving arguments***)       50 def _default_conversion_function(value, dtype, name, as_ref):       51   del as_ref  # Unused.  ---> 52   return constant_op.constant(value, dtype, name=name)       53        54     ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\constant_op.py in constant(value, dtype, shape, name)      269     ValueError: if called on a symbolic tensor.      
270   """  --> 271   return _constant_impl(value, dtype, shape, name, verify_shape=False,      272                         allow_broadcast=True)      273       ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\constant_op.py in _constant_impl(value, dtype, shape, name, verify_shape, allow_broadcast)      281       with trace.Trace("tf.constant"):      282         return _constant_eager_impl(ctx, value, dtype, shape, verify_shape)  --> 283     return _constant_eager_impl(ctx, value, dtype, shape, verify_shape)      284       285   g = ops.get_default_graph()    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\constant_op.py in _constant_eager_impl(ctx, value, dtype, shape, verify_shape)      306 def _constant_eager_impl(ctx, value, dtype, shape, verify_shape):      307   """Creates a constant on the current device."""  --> 308   t = convert_to_eager_tensor(value, ctx, dtype)      309   if shape is None:      310     return t    ~\AppData\Local\Programs\Python\Python39\lib\site-packages\tensorflow\python\framework\constant_op.py in convert_to_eager_tensor(value, ctx, dtype)      104       dtype = dtypes.as_dtype(dtype).as_datatype_enum      105   ctx.ensure_initialized()  --> 106   return ops.EagerTensor(value, ctx.device_name, dtype)      107       108     ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type numpy.ndarray).  

JAX-RS Filter injection of a CDI Singleton that references a @ConfigMapping object fails in Quarkus

Posted: 15 Aug 2021 08:05 AM PDT

I am trying to use the latest Quarkus configuration mechanism, @ConfigMapping, so I defined a GreetingConfig interface:

@ConfigMapping(prefix = "org.acme.greeting")  public interface GreetingConfig {      String prefix();  }  

That config is injected into a CDI singleton:

@Singleton  public class GreetingService {        @Inject GreetingConfig config;        public String greet(String user) {          return config.prefix().concat(" - Hello ").concat(user).concat(" !!");      }  }  

The singleton is injected into both a JAX-RS filter and a JAX-RS resource:

@Provider  public class GreetingFilter implements ContainerRequestFilter {        private static final Logger LOGGER = Logger.getLogger(GreetingFilter.class.getName());        @Inject GreetingService service;        @Override      public void filter(ContainerRequestContext requestContext) throws IOException {          LOGGER.log(Level.INFO, service.greet("Quarkus"));      }  }  
@Path("/hello")  public class GreetingResource {        @Inject GreetingService service;        @GET      @Produces(MediaType.TEXT_PLAIN)      public String hello(@QueryParam("user") @DefaultValue("Quarkus") String user) {          return service.greet(user);      }  }  

The application.properties file also contains:

org.acme.greeting.prefix=ACME Greetings  

The application crashes at startup:

java.lang.RuntimeException: java.lang.ExceptionInInitializerError        at io.quarkus.test.junit.QuarkusTestExtension.throwBootFailureException(QuarkusTestExtension.java:712)      at io.quarkus.test.junit.QuarkusTestExtension.interceptTestClassConstructor(QuarkusTestExtension.java:785)      at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)      at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)      at org.junit.jupiter.api.extension.InvocationInterceptor.interceptTestClassConstructor(InvocationInterceptor.java:72)      at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)      at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)      at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)      at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)      at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)      at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)      at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:77)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.invokeTestClassConstructor(ClassBasedTestDescriptor.java:342)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.instantiateTestClass(ClassBasedTestDescriptor.java:289)      at org.junit.jupiter.engine.descriptor.ClassTestDescriptor.instantiateTestClass(ClassTestDescriptor.java:79)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.instantiateAndPostProcessTestInstance(ClassBasedTestDescriptor.java:267)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.lambda$testInstancesProvider$2(ClassBasedTestDescriptor.java:259)      at java.base/java.util.Optional.orElseGet(Optional.java:369)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.lambda$testInstancesProvider$3(ClassBasedTestDescriptor.java:258)      at org.junit.jupiter.engine.execution.TestInstancesProvider.getTestInstances(TestInstancesProvider.java:31)      at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$prepare$0(TestMethodTestDescriptor.java:101)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.prepare(TestMethodTestDescriptor.java:100)      at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.prepare(TestMethodTestDescriptor.java:65)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$prepare$1(NodeTestTask.java:111)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.prepare(NodeTestTask.java:111)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:79)      at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)      at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)      at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:143)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129)      at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84)      at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)      at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:143)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:129)      at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:127)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:126)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:84)      at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)      at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)      at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)      at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:108)      at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)      at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)      at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)      at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)      at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:96)      at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:75)      at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:71)      at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)      at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)      at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)  Caused by: java.lang.ExceptionInInitializerError      at 
java.base/java.lang.Class.forName0(Native Method)      at java.base/java.lang.Class.forName(Class.java:398)      at io.quarkus.runner.bootstrap.StartupActionImpl.run(StartupActionImpl.java:165)      at io.quarkus.test.junit.QuarkusTestExtension.doJavaStart(QuarkusTestExtension.java:380)      at io.quarkus.test.junit.QuarkusTestExtension.ensureStarted(QuarkusTestExtension.java:680)      at io.quarkus.test.junit.QuarkusTestExtension.beforeAll(QuarkusTestExtension.java:727)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.lambda$invokeBeforeAllCallbacks$8(ClassBasedTestDescriptor.java:368)      at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.invokeBeforeAllCallbacks(ClassBasedTestDescriptor.java:368)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.before(ClassBasedTestDescriptor.java:192)      at org.junit.jupiter.engine.descriptor.ClassBasedTestDescriptor.before(ClassBasedTestDescriptor.java:78)      at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:136)      ... 31 more  Caused by: java.lang.RuntimeException: Failed to start quarkus      at io.quarkus.runner.ApplicationImpl.<clinit>(ApplicationImpl.zig:225)      ... 43 more  Caused by: java.lang.RuntimeException: Error injecting org.acme.GreetingService org.acme.api.filter.GreetingFilter.service      at org.acme.api.filter.GreetingFilter_Bean.create(GreetingFilter_Bean.zig:148)      at org.acme.api.filter.GreetingFilter_Bean.create(GreetingFilter_Bean.zig:171)      at io.quarkus.arc.impl.AbstractSharedContext.createInstanceHandle(AbstractSharedContext.java:96)      at io.quarkus.arc.impl.AbstractSharedContext$1.get(AbstractSharedContext.java:29)      at io.quarkus.arc.impl.AbstractSharedContext$1.get(AbstractSharedContext.java:26)      at io.quarkus.arc.impl.LazyValue.get(LazyValue.java:26)      at io.quarkus.arc.impl.ComputingCache.computeIfAbsent(ComputingCache.java:69)      at io.quarkus.arc.impl.AbstractSharedContext.get(AbstractSharedContext.java:26)      at org.acme.api.filter.GreetingFilter_Bean.get(GreetingFilter_Bean.zig:203)      at org.acme.api.filter.GreetingFilter_Bean.get(GreetingFilter_Bean.zig:219)      at io.quarkus.arc.impl.ArcContainerImpl.beanInstanceHandle(ArcContainerImpl.java:433)      at io.quarkus.arc.impl.ArcContainerImpl.beanInstanceHandle(ArcContainerImpl.java:446)      at io.quarkus.arc.impl.ArcContainerImpl$1.get(ArcContainerImpl.java:269)      at io.quarkus.arc.impl.ArcContainerImpl$1.get(ArcContainerImpl.java:266)      at io.quarkus.resteasy.common.runtime.QuarkusConstructorInjector.construct(QuarkusConstructorInjector.java:39)      at org.jboss.resteasy.core.providerfactory.ResteasyProviderFactoryImpl.injectedInstance(ResteasyProviderFactoryImpl.java:1399)      at org.jboss.resteasy.core.interception.jaxrs.JaxrsInterceptorRegistryImpl$AbstractInterceptorFactory.createInterceptor(JaxrsInterceptorRegistryImpl.java:150)      at org.jboss.resteasy.core.interception.jaxrs.JaxrsInterceptorRegistryImpl$OnDemandInterceptorFactory.initialize(JaxrsInterceptorRegistryImpl.java:168)      at org.jboss.resteasy.core.interception.jaxrs.JaxrsInterceptorRegistryImpl$OnDemandInterceptorFactory.checkInitialize(JaxrsInterceptorRegistryImpl.java:183)      at 
org.jboss.resteasy.core.interception.jaxrs.JaxrsInterceptorRegistryImpl$OnDemandInterceptorFactory.getInterceptor(JaxrsInterceptorRegistryImpl.java:193)      at org.jboss.resteasy.core.interception.jaxrs.JaxrsInterceptorRegistryImpl$AbstractInterceptorFactory.postMatch(JaxrsInterceptorRegistryImpl.java:131)      at org.jboss.resteasy.core.interception.jaxrs.JaxrsInterceptorRegistryImpl.postMatch(JaxrsInterceptorRegistryImpl.java:288)      at org.jboss.resteasy.core.interception.jaxrs.ContainerRequestFilterRegistryImpl.postMatch(ContainerRequestFilterRegistryImpl.java:30)      at org.jboss.resteasy.core.interception.jaxrs.ContainerRequestFilterRegistryImpl.postMatch(ContainerRequestFilterRegistryImpl.java:12)      at org.jboss.resteasy.core.ResourceMethodInvoker.<init>(ResourceMethodInvoker.java:142)      at org.jboss.resteasy.core.ResourceMethodRegistry.processMethod(ResourceMethodRegistry.java:381)      at org.jboss.resteasy.core.ResourceMethodRegistry.register(ResourceMethodRegistry.java:308)      at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:259)      at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:227)      at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:208)      at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:192)      at org.jboss.resteasy.core.ResourceMethodRegistry.addResourceFactory(ResourceMethodRegistry.java:175)      at org.jboss.resteasy.core.ResourceMethodRegistry.addPerRequestResource(ResourceMethodRegistry.java:87)      at org.jboss.resteasy.core.ResteasyDeploymentImpl.registerResources(ResteasyDeploymentImpl.java:518)      at org.jboss.resteasy.core.ResteasyDeploymentImpl.registration(ResteasyDeploymentImpl.java:475)      at org.jboss.resteasy.core.ResteasyDeploymentImpl.startInternal(ResteasyDeploymentImpl.java:164)      at org.jboss.resteasy.core.ResteasyDeploymentImpl.start(ResteasyDeploymentImpl.java:121)      at io.quarkus.resteasy.runtime.standalone.ResteasyStandaloneRecorder.staticInit(ResteasyStandaloneRecorder.java:37)      at io.quarkus.deployment.steps.ResteasyStandaloneBuildStep$staticInit-210558872.deploy_0(ResteasyStandaloneBuildStep$staticInit-210558872.zig:897)      at io.quarkus.deployment.steps.ResteasyStandaloneBuildStep$staticInit-210558872.deploy(ResteasyStandaloneBuildStep$staticInit-210558872.zig:40)      at io.quarkus.runner.ApplicationImpl.<clinit>(ApplicationImpl.zig:205)      ... 
43 more  Caused by: java.lang.RuntimeException: Error injecting org.acme.config.GreetingConfig org.acme.GreetingService.config      at org.acme.GreetingService_Bean.create(GreetingService_Bean.zig:199)      at org.acme.GreetingService_Bean.create(GreetingService_Bean.zig:222)      at io.quarkus.arc.impl.AbstractSharedContext.createInstanceHandle(AbstractSharedContext.java:96)      at io.quarkus.arc.impl.AbstractSharedContext$1.get(AbstractSharedContext.java:29)      at io.quarkus.arc.impl.AbstractSharedContext$1.get(AbstractSharedContext.java:26)      at io.quarkus.arc.impl.LazyValue.get(LazyValue.java:26)      at io.quarkus.arc.impl.ComputingCache.computeIfAbsent(ComputingCache.java:69)      at io.quarkus.arc.impl.AbstractSharedContext.get(AbstractSharedContext.java:26)      at org.acme.GreetingService_Bean.get(GreetingService_Bean.zig:254)      at org.acme.GreetingService_Bean.get(GreetingService_Bean.zig:270)      at org.acme.api.filter.GreetingFilter_Bean.create(GreetingFilter_Bean.zig:131)      ... 83 more  Caused by: java.util.NoSuchElementException: SRCFG00027: Could not find a mapping for org.acme.config.GreetingConfig      at io.smallrye.config.ConfigMappings.getConfigMapping(ConfigMappings.java:73)      at io.smallrye.config.SmallRyeConfig.getConfigMapping(SmallRyeConfig.java:423)      at io.quarkus.arc.runtime.ConfigMappingCreator.create(ConfigMappingCreator.java:28)      at org.acme.config.GreetingConfig_04302c935a6cb43e5f29ca4271833ab99b620b3b_Synthetic_Bean.create(GreetingConfig_04302c935a6cb43e5f29ca4271833ab99b620b3b_Synthetic_Bean.zig:128)      at org.acme.config.GreetingConfig_04302c935a6cb43e5f29ca4271833ab99b620b3b_Synthetic_Bean.get(GreetingConfig_04302c935a6cb43e5f29ca4271833ab99b620b3b_Synthetic_Bean.zig:159)      at org.acme.config.GreetingConfig_04302c935a6cb43e5f29ca4271833ab99b620b3b_Synthetic_Bean.get(GreetingConfig_04302c935a6cb43e5f29ca4271833ab99b620b3b_Synthetic_Bean.zig:182)      at io.quarkus.arc.impl.CurrentInjectionPointProvider.get(CurrentInjectionPointProvider.java:52)      at org.acme.GreetingService_Bean.create(GreetingService_Bean.zig:182)      ... 93 more  

Injection into the JAX-RS resource works regardless of the service's bean scope (@ApplicationScoped or @Singleton), but in the filter only an @ApplicationScoped bean can be injected without triggering the runtime exception...

Is this a bug, or is it expected that a pseudo-scoped bean that references a @ConfigMapping cannot be injected into a JAX-RS @Provider (filter or BodyWriter)?

WebClient throws 'An exception occurred during WebClient request'

Posted: 15 Aug 2021 08:05 AM PDT

I know this question has been asked a lot on the internet, but I still haven't found a satisfying answer.

private string LocalSqlDriverDownloader()  {      ProgBar prograssBar = new();      string sqlLocalDBUrl = "https://download.microsoft.com/download/7/c/1/7c14e92e-bdcb-4f89-b7cf-93543e7112d1/SqlLocalDB.msi";      string fileName = "SqlLocalDB.msi";      string directory = $@"{Path.GetPathRoot(Environment.SystemDirectory)}Download"; // C:\Download      if (!Directory.Exists(directory))      {          Directory.CreateDirectory(directory);      }      using WebClient webClient = new();      webClient.DownloadProgressChanged += (s, e) =>      {          Application.Current.Dispatcher?.Invoke(() =>          {              (prograssBar.DataContext as PrograssbarWindowViewModel).PrograssBarValue = e.ProgressPercentage;          });      };      webClient.DownloadFileCompleted += (s, e) =>      {          prograssBar.Close();      };      string downloadPath = $@"{directory}\{fileName}";      try      {          webClient.DownloadFile(sqlLocalDBUrl, downloadPath);      }      catch (Exception e)      {          throw new Exception(e.Message);      }      prograssBar.ShowDialog();      return directory;  }  

I have no clue why this throws an exception. I tried downloading other files, over both HTTP and HTTPS, and it doesn't seem to make any difference to the outcome.

The given exception:

System.Exception    HResult=0x80131500    Message=An exception occurred during WebClient request.    Source=PulserTesterMultipleHeads    StackTrace:     at PulserTesterMultipleHeads.Models.MainTestPageMV.LocalSqlDriverDownloader() in C:\Project\Models\MainTestPageMV.cs:line 955     at PulserTesterMultipleHeads.Models.MainTestPageMV.LocalSQLDriverInstaller() in C:\Project\Models\MainTestPageMV.cs:line 905     at PulserTesterMultipleHeads.Models.MainTestPageMV..ctor(Action closeAction, String catalogDesc) in C:\Project\Models\MainTestPageMV.cs:line 70     at PulserTesterMultipleHeads.UserControls.MainTestPage..ctor() in C:\Project\UserControls\MainTestPage.xaml.cs:line 31  

No matching distribution found for opencv-python

Posted: 15 Aug 2021 08:04 AM PDT

I have Python 2.7, and when I try to install OpenCV I get this error:

pip install opencv-python    Looking in indexes: http://pypi.dd.dynamore/simple, https://pypi.python.org/simple/  Collecting opencv-python    ERROR: Could not find a version that satisfies the requirement opencv-python (from versions: none)  ERROR: No matching distribution found for opencv-python  

I have the latest pip version (19.2.3), and I have also installed this:

sudo apt-get install python-opencv  
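A likely explanation (not confirmed from the output above): recent opencv-python wheels are published for Python 3 only, so pip running under a 2.7 interpreter finds no matching distribution no matter which index it queries. A small sketch of a check, with commonly cited workarounds noted as comments:

    import sys

    # (2, 7, ...) here means no current opencv-python release will match,
    # since newer wheels are built only for Python 3.
    print(sys.version_info)

    # The apt package python-opencv installs cv2 for the system Python,
    # so importing it may already work even though pip fails.
    try:
        import cv2
        print(cv2.__version__)
    except ImportError:
        print("cv2 is not available for this interpreter")

    # If staying on 2.7 is unavoidable, pinning the last release that
    # reportedly shipped 2.7 wheels is one option (shell command):
    #   python -m pip install "opencv-python==4.2.0.32"
    # Otherwise, running pip from a Python 3 interpreter resolves it.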

npm : Depends: node-gyp (>= 3.6.2~) but it is not going to be installed

Posted: 15 Aug 2021 08:05 AM PDT

I did the following steps:

git clone https://github.com/nibtehaz/NORTH-app.git  curl -sL https://deb.nodesource.com/setup_10.x | sudo -E bash -  sudo apt-get install -y nodejs  npm install # which caused the following error  

and got:

The following packages have unmet dependencies:   nodejs : Conflicts: npm   npm : Depends: node-gyp (>= 3.6.2~) but it is not going to be installed  E: Unable to correct problems, you have held broken packages.  

I did what was suggested, but it led to another problem:

sudo apt install node-gyp  Reading package lists... Done  Building dependency tree         Reading state information... Done  Some packages could not be installed. This may mean that you have  requested an impossible situation or if you are using the unstable  distribution that some required packages have not yet been created  or been moved out of Incoming.  The following information may help to resolve the situation:    The following packages have unmet dependencies:   node-gyp : Depends: libnode-dev but it is not going to be installed  E: Unable to correct problems, you have held broken packages  

What did I miss?

Unable to install psycopg2 (pip install psycopg2)

Posted: 15 Aug 2021 08:05 AM PDT

I'm using a Mac and Python version 2.7.14.

Collecting psycopg2    Could not fetch URL https://pypi.python.org/simple/psycopg2/: There was a problem confirming the ssl certificate: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:661) - skipping    Could not find a version that satisfies the requirement psycopg2 (from versions: )  No matching distribution found for psycopg2  
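The TLSV1_ALERT_PROTOCOL_VERSION line usually points at the TLS stack rather than at psycopg2 itself: PyPI only accepts TLS 1.2 connections, and Apple's bundled Python 2.7 commonly links against an OpenSSL that is too old to negotiate it. A quick diagnostic sketch (the remedies listed are assumptions about this kind of setup, not something the log confirms):

    import ssl
    import sys

    # TLS 1.2 requires OpenSSL 1.0.1 or newer; the macOS system Python 2.7
    # often reports OpenSSL 0.9.8, which produces exactly this
    # "tlsv1 alert protocol version" failure when pip contacts PyPI.
    print(sys.version)
    print(ssl.OPENSSL_VERSION)

    # Typical remedies: install a python.org or Homebrew Python (both ship
    # a modern OpenSSL) and run pip from it, for example:
    #   /usr/local/bin/python2.7 -m pip install psycopg2-binary
    # Building psycopg2 from source additionally needs pg_config from a
    # PostgreSQL installation on PATH, which is a separate hurdle.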

codesign_allocate: error: unable to find utility "codesign_allocate", not a developer tool or in PATH

Posted: 15 Aug 2021 08:04 AM PDT

I upgraded my Mac to 10.10.3 and Xcode to 6.3.1 yesterday; today I can't use git, codesign_allocate, and more.

`sh: line 1: 18421 Abort trap: 6           /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild -sdk /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.3.sdk -find codesign_allocate 2> /dev/null  codesign_allocate: error: unable to find utility "codesign_allocate", not a developer tool or in PATH  /Users/ylx/Library/Developer/Xcode/DerivedData/test-ahpvhmtqeafuveekexklyxulvmfa/Build/Products/Debug-iphoneos/test.app/Frameworks/libswiftCore.dylib: the codesign_allocate helper tool cannot be found or used  *** error: Couldn't codesign /Users/ylx/Library/Developer/Xcode/DerivedData/test-ahpvhmtqeafuveekexklyxulvmfa/Build/Products/Debug-iphoneos/test.app/Frameworks/libswiftCore.dylib: codesign failed with exit code 1  Command /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/swift-stdlib-tool failed with exit code 1`  
