
Notable | Notable is an app for Android notification reminders | Notification library

by icechen1 | Java | Version: Current | License: No License


kandi X-RAY | Notable Summary

Notable is a Java library typically used in Messaging and Notification applications. Notable has no bugs, it has a build file available, and it has low support. However, Notable has 1 reported vulnerability. You can download it from GitHub.
Notable is an app for Android notification reminders.

Support

  • Notable has a low active ecosystem.
  • It has 52 stars and 14 forks. There are 4 watchers for this library.
  • It had no major release in the last 12 months.
  • There are 10 open issues and 0 closed issues. On average, issues are closed in 1,254 days. There are 2 open pull requests and 0 closed requests.
  • It has a neutral sentiment in the developer community.
  • The latest version of Notable is current.

Quality

  • Notable has 0 bugs and 0 code smells.

Security

  • Notable has 1 vulnerability reported (1 critical, 0 high, 0 medium, 0 low).
  • Notable code analysis shows 0 unresolved vulnerabilities.
  • There are 0 security hotspots that need review.

License

  • Notable does not have a standard license declared.
  • Check the repository for any license declaration and review the terms closely.
  • Without a license, all rights are reserved, and you cannot use the library in your applications.

Reuse

  • Notable releases are not available. You will need to build from source code and install.
  • Build file is available. You can build the component from source.
Top functions reviewed by kandi - BETA

kandi's functional review helps you automatically verify the functionality of libraries and avoid rework.
Currently covering the most popular Java, JavaScript, and Python libraries. See a Sample Here.

Get all kandi verified functions for this library.


Notable Key Features

Notable is an app for Android notification reminders
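
For orientation only, the sketch below shows how an Android app typically posts a reminder notification with the standard AndroidX NotificationCompat API. This is not code taken from Notable's repository; the class name, channel id, icon, and method signature are illustrative assumptions.

import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.content.Context;
import android.os.Build;

import androidx.core.app.NotificationCompat;

final class ReminderNotifications {

    // Hypothetical channel id for illustration; Notable's real identifiers may differ.
    private static final String CHANNEL_ID = "reminders";

    static void showReminder(Context context, int id, String title, String text) {
        NotificationManager manager =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);

        // Notification channels are mandatory on Android 8.0 (API 26) and above.
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            manager.createNotificationChannel(new NotificationChannel(
                    CHANNEL_ID, "Reminders", NotificationManager.IMPORTANCE_DEFAULT));
        }

        Notification notification = new NotificationCompat.Builder(context, CHANNEL_ID)
                .setSmallIcon(android.R.drawable.ic_dialog_info)
                .setContentTitle(title)
                .setContentText(text)
                .setAutoCancel(true)
                .build();

        manager.notify(id, notification);
    }
}

On Android 13 (API 33) and above, the POST_NOTIFICATIONS runtime permission must also be granted before a notification like this is shown.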

How can a simple date variable hurt my run performance?

    -- OPTION(RECOMPILE) recompiles the statement each run, so the optimizer sees the
    -- actual value in @datetouse instead of a generic estimate for a local variable.
    OPTION( RECOMPILE )
    -- Alternatively, OPTIMIZE FOR pins the plan to one representative value.
    OPTION( OPTIMIZE FOR (  @datetouse =  '2022-03-05' ) )

C Programming - Space Character Not Detected

/* strstr(delims, &inText[i]) searches delims for the whole remaining string, so a lone space is missed. */
if(strstr(delims, &inText[i]) != NULL)
/* strcspn gives the length of the leading run of non-delimiter characters. */
i = strcspn( &inText[i], delims );
size_t n = strcspn( &inText[i], delims );
/* strchr(delims, inText[i]) tests the single character, which is the intended check. */
if(strchr( delims, inText[i]) != NULL)

Is there a way to set a vuetify data table page before the table is loaded?

// Remember the requested page, let the table initialize on page 1,
// then restore the page on the next tick once the table has rendered.
const p = this.page
this.page = 1
this.$nextTick(() => {
  this.page = p
})

Compress & Upload large videos to Google cloud storage using Flutter/Dart

dependencies:
  camera: ^0.9.4+12
  flutter_riverpod: ^1.0.3
  ffmpeg_kit_flutter_full_gpl: ^4.5.1
  wakelock: ^0.5.6
@immutable
class TranscodeUploadMessage {
  const TranscodeUploadMessage({
    required this.id,
    required this.statusTitle,
    required this.statusMessage,
    required this.uploadPercentage,
    required this.isRunning,
    required this.completed,
    required this.showSpinner,
    required this.showPercentage,
    required this.showError,
  });

  final int id;
  final String statusTitle;
  final String statusMessage;
  final String uploadPercentage;
  final bool isRunning;
  final bool completed;
  final bool showSpinner;
  final bool showPercentage;
  final bool showError;

  TranscodeUploadMessage copyWith({
    int? id,
    String? statusTitle,
    String? statusMessage,
    String? uploadPercentage,
    bool? isRunning,
    bool? completed,
    bool? showSpinner,
    bool? showPercentage,
    bool? showError,
  }) {
    return TranscodeUploadMessage(
      id: id ?? this.id,
      statusTitle: statusTitle ?? this.statusTitle,
      statusMessage: statusMessage ?? this.statusMessage,
      uploadPercentage: uploadPercentage ?? this.uploadPercentage,
      isRunning: isRunning ?? this.isRunning,
      completed: completed ?? this.completed,
      showSpinner: showSpinner ?? this.showSpinner,
      showPercentage: showPercentage ?? this.showPercentage,
      showError: showError ?? this.showError,
    );
  }
}

class TranscodeUploadMessageNotifier
    extends StateNotifier<List<TranscodeUploadMessage>> {
  TranscodeUploadMessageNotifier() : super([]);

  /// Since our state is immutable, we are not allowed to do
  /// `state.add(message)`. Instead, we should create a new list of messages which
  /// contains the previous items and the new one.
  ///
  /// Using Dart's spread operator here is helpful!
  void set(TranscodeUploadMessage message) {
    state = [...state, message];
  }

  /// Our state is immutable. So we're making a new list instead of changing
  /// the existing list.
  void remove(int id) {
    state = [
      for (final message in state)
        if (message.id != id) message,
    ];
  }

  /// Update message. Since our state is immutable, we need to make a copy of
  /// the message. We're using our `copyWith` method implemented before to help
  /// with that.
  void update(TranscodeUploadMessage messageUpdated) {
    state = [
      for (final message in state)
        if (message.id == messageUpdated.id)

          /// Use copyWith to update a message
          message.copyWith(
            statusTitle: messageUpdated.statusTitle,
            statusMessage: messageUpdated.statusMessage,
            uploadPercentage: messageUpdated.uploadPercentage,
            isRunning: messageUpdated.isRunning,
            completed: messageUpdated.completed,
            showSpinner: messageUpdated.showSpinner,
            showPercentage: messageUpdated.showPercentage,
            showError: messageUpdated.showError,
          )
        else

          /// other messages, which there are not any at this time, are not
          /// modified
          message,
    ];
  }
}

/// Using StateNotifierProvider to allow the UI to interact with our
/// TranscodeUploadMessageNotifier class.
final transcodeMessageProvider = StateNotifierProvider.autoDispose<
    TranscodeUploadMessageNotifier, List<TranscodeUploadMessage>>((ref) {
  return TranscodeUploadMessageNotifier();
});
/// By default, set to video ffmpeg command.
String ffmpegCommand = '-i $messageUri '
    '-acodec aac '
    '-vcodec libx264 '
    '-f mp4 -preset veryfast '
    '-movflags frag_keyframe+empty_moov '
    '-crf 23 $newMessageUri';

if (_recordingType == RecordingType.audio) {
  ffmpegCommand = '-vn '
      '-i $messageUri '
      '-y '
      '-acodec libmp3lame '
      '-f '
      'mp3 '
      '$newMessageUri';
}

/// Set the initial state notifier as we start transcoding.
ref
.read(transcodeMessageProvider.notifier)
.set(const TranscodeUploadMessage(
  id: 1,
  statusTitle: 'Transcoding Recording',
  statusMessage: 'Your recording is being transcoded '
      'before upload. Please do not navigate away from this screen.',
  uploadPercentage: '0%',
  isRunning: true,
  completed: false,
  showSpinner: false,
  showPercentage: false,
  showError: false,
));

await FFmpegKit.executeAsync(
  ffmpegCommand,
  (Session session) async {
    final ReturnCode? returnCode = await session.getReturnCode();

    if (ReturnCode.isSuccess(returnCode)) {
      /// Transcoding is complete, now display uploading message
      /// and spinner at 0%.
      ref
          .read(transcodeMessageProvider.notifier)
          .update(const TranscodeUploadMessage(
            id: 1,
            statusTitle: 'Uploading Recording',
            statusMessage:
                'Your recording is now being '
                'uploaded. Please do not navigate away from this screen.',
            uploadPercentage: '0%',
            isRunning: true,
            completed: false,
            showSpinner: true,
            showPercentage: true,
            showError: false,
          ));

      /// Upload the now transcoded video/audio to cloud storage.
      /// Use flutterfire firebase storage tasks to get upload
      /// progress. Your firebase storage function can also 
      /// reuse the transcodeMessageProvider to send UI state
      /// updates for the upload, which will happen very quickly
      /// even on slow connections now that the recording size
      /// is dramatically reduced.
      await uploadRecordingToFirebaseCloudStorage(ref);
    } else if (ReturnCode.isCancel(returnCode)) {
      // Do something if canceled
    } else {
      // Do something with the error
    }
  },
  (Log log) => debugPrint(log.getMessage()),
  (Statistics statistic) {
    /// Statistics provides a running transcoding progress meter.
    int completePercentage = (statistic.getTime() * 100) ~/ _duration!;
    ref
        .read(transcodeMessageProvider.notifier)
        .update(TranscodeUploadMessage(
          id: 1,
          statusTitle: 'Transcoding Recording',
          statusMessage: 'Your recording is being '
              'transcoded. Please do not navigate away from this screen.',
          uploadPercentage: '$completePercentage%',
          isRunning: true,
          completed: false,
          showSpinner: true,
          showPercentage: true,
          showError: false,
        ));
  }).then((Session session) {
debugPrint(
    'Async FFmpeg process started with sessionId ${session.getSessionId()}.');
}).catchError((error) async {
debugPrint('transcoding error: $error');
});
Consumer(
    builder: (context, ref, child) {
      final List<TranscodeUploadMessage> messages =
          ref.watch(transcodeMessageProvider);
      if (messages.isEmpty) {
        return const SizedBox.shrink();
      }

      final message = messages[0];

      if (message.isRunning ||
          message.completed ||
          message.showError) {
        // Display widgets with StateNotifier data
      }

      return const SizedBox.shrink();
    },
)

In JavaScript, I cannot successfully access a randomly-selected property from my custom object class Question using dot notation

 // Dot notation looks up a property literally named "currProperty", which does not exist:
 currProperty = propertiesArray[someRandomNum]
 console.log(currObj, currProperty);
 choiceListItem.innerText = currObj.currProperty;
 // Bracket notation evaluates the variable, so the randomly selected property name is used:
 currProperty = propertiesArray[someRandomNum]
 console.log(currObj, currProperty);
 choiceListItem.innerText = currObj[currProperty];
var home = {
  color: "blue",
  occupied: true,
  property: "apartment"
}

var property = "color";

console.log("Home.Property: ", home.property)   // "apartment" – the literal key "property"
console.log("Home[Property]: ", home[property]) // "blue" – the key held in the variable

Merging of two files with substitution

sed -e '${r file2' -e ';$d;}' file1

string 1
string 2
string N
string 3
string 4
string M
sed -i.bak -e "\${r $f2" -e ';$d;}' "$f1"
% awk 'set==""&&NR!=FNR&&last=="END"{last="";set=1} 
    last!=""{print last} 
    {last=$0} END{print}' file file
string 1
string 2
END
string N
string 1
string 2
END
string N
END
% awk 'set==""&&NR!=FNR&&last=="END"{last="";set=1} 
    last!=""{print last} 
    {last=$0} END{print}' file1 file2
string 1
string 2
string N
string 3
string 4
string M
% cat file
string 1
string 2
END
string N
END

% cat file1
string 1
string 2
string N
END

% cat file2
string 3
string 4
string M
$ cat <(grep -v '^END$' file1) file2 > file3
$ cat file3
string 1
string 2
string N
string 3
string 4
string M
$ awk 'FNR==NR && /^END$/ {next} 1' file1 file2 > file3
$ cat file3
string 1
string 2
string N
string 3
string 4
string M
head -n-1 file1 | cat - file2 > file3
sed -i -e '$r file2' -e '$d' file1
tac file1 | awk 'FNR==1{system("tac file2");next} 1' | tac
tac file1 |               ##using tac command to read contents from bottom to up and sending its standard output as standard input to awk command here.
awk '                     ##Starting awk program from here.
  FNR==1{                 ##Checking condition if this is first line then do following.
    system("tac file2")   ##Printing file2 bottom to up contents.
    next                  ##next will skip all further contents from here.
  }
  1                       ##Printing rest of lines(passed from tac file1) here.
' | tac                   ##Reversing order again to make it into original order.

Tensorflow tf.data.Dataset.cache seems do not take the expected effect

import random
import struct
import tensorflow as tf
import numpy as np

RAW_N = 2 + 20*20 + 1

bytess = random.sample(range(1, 5000), RAW_N*4)
with open('mydata.bin', 'wb') as f:
  f.write(struct.pack('1612i', *bytess))
def decode_and_prepare(register):
  register = tf.io.decode_raw(register, out_type=tf.float32)
  inputs = register[2:402]
  label = tf.random.uniform(()) + register[402:]
  return inputs, label

raw_dataset = tf.data.FixedLengthRecordDataset(filenames=['/content/mydata.bin']*7000, record_bytes=RAW_N*4)
raw_dataset = raw_dataset.map(decode_and_prepare)
# 1) Baseline input pipeline: shuffle + batch + prefetch, without cache().
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 3ms/step - loss: 0.1425
Epoch 2/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41be037d0>
# 2) Same pipeline with cache() inserted before batching.
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 2ms/step - loss: 0.1428
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41fa87810>
# 3) cache() plus one full pass over the dataset up front, so the cache is already
#    populated before training and later epochs read from it.
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
_ = list(train_ds.as_numpy_iterator())  # iterate dataset beforehand to fill the cache
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 3s 3ms/step - loss: 0.1427
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41ac9c850>
import random
import struct
import tensorflow as tf
import numpy as np

RAW_N = 2 + 20*20 + 1

bytess = random.sample(range(1, 5000), RAW_N*4)
with open('mydata.bin', 'wb') as f:
  f.write(struct.pack('1612i', *bytess))
def decode_and_prepare(register):
  register = tf.io.decode_raw(register, out_type=tf.float32)
  inputs = register[2:402]
  label = tf.random.uniform(()) + register[402:]
  return inputs, label

raw_dataset = tf.data.FixedLengthRecordDataset(filenames=['/content/mydata.bin']*7000, record_bytes=RAW_N*4)
raw_dataset = raw_dataset.map(decode_and_prepare)
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 3ms/step - loss: 0.1425
Epoch 2/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41be037d0>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 2ms/step - loss: 0.1428
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41fa87810>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
_ = list(train_ds.as_numpy_iterator()) # iterate dataset beforehand
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 3s 3ms/step - loss: 0.1427
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41ac9c850>
import random
import struct
import tensorflow as tf
import numpy as np

RAW_N = 2 + 20*20 + 1

bytess = random.sample(range(1, 5000), RAW_N*4)
with open('mydata.bin', 'wb') as f:
  f.write(struct.pack('1612i', *bytess))
def decode_and_prepare(register):
  register = tf.io.decode_raw(register, out_type=tf.float32)
  inputs = register[2:402]
  label = tf.random.uniform(()) + register[402:]
  return inputs, label

raw_dataset = tf.data.FixedLengthRecordDataset(filenames=['/content/mydata.bin']*7000, record_bytes=RAW_N*4)
raw_dataset = raw_dataset.map(decode_and_prepare)
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 3ms/step - loss: 0.1425
Epoch 2/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41be037d0>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 2ms/step - loss: 0.1428
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41fa87810>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
_ = list(train_ds.as_numpy_iterator()) # iterate dataset beforehand
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 3s 3ms/step - loss: 0.1427
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41ac9c850>
import random
import struct
import tensorflow as tf
import numpy as np

RAW_N = 2 + 20*20 + 1

bytess = random.sample(range(1, 5000), RAW_N*4)
with open('mydata.bin', 'wb') as f:
  f.write(struct.pack('1612i', *bytess))
def decode_and_prepare(register):
  register = tf.io.decode_raw(register, out_type=tf.float32)
  inputs = register[2:402]
  label = tf.random.uniform(()) + register[402:]
  return inputs, label

raw_dataset = tf.data.FixedLengthRecordDataset(filenames=['/content/mydata.bin']*7000, record_bytes=RAW_N*4)
raw_dataset = raw_dataset.map(decode_and_prepare)
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 3ms/step - loss: 0.1425
Epoch 2/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41be037d0>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 2ms/step - loss: 0.1428
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41fa87810>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
_ = list(train_ds.as_numpy_iterator()) # iterate dataset beforehand
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 3s 3ms/step - loss: 0.1427
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41ac9c850>
import random
import struct
import tensorflow as tf
import numpy as np

RAW_N = 2 + 20*20 + 1

bytess = random.sample(range(1, 5000), RAW_N*4)
with open('mydata.bin', 'wb') as f:
  f.write(struct.pack('1612i', *bytess))
def decode_and_prepare(register):
  register = tf.io.decode_raw(register, out_type=tf.float32)
  inputs = register[2:402]
  label = tf.random.uniform(()) + register[402:]
  return inputs, label

raw_dataset = tf.data.FixedLengthRecordDataset(filenames=['/content/mydata.bin']*7000, record_bytes=RAW_N*4)
raw_dataset = raw_dataset.map(decode_and_prepare)
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 3ms/step - loss: 0.1425
Epoch 2/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 4s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41be037d0>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 4s 2ms/step - loss: 0.1428
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 3ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41fa87810>
total_data_entries = len(list(raw_dataset.map(lambda x, y: (x, y))))
train_ds = raw_dataset.shuffle(buffer_size=total_data_entries).cache().batch(32).prefetch(tf.data.AUTOTUNE)
_ = list(train_ds.as_numpy_iterator()) # iterate dataset beforehand
inputs = tf.keras.layers.Input((400,))
x = tf.keras.layers.Dense(200, activation='relu', kernel_initializer='normal')(inputs)
x = tf.keras.layers.Dense(100, activation='relu', kernel_initializer='normal')(x)
outputs = tf.keras.layers.Dense(1, kernel_initializer='normal')(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mse')
model.fit(train_ds, epochs=5)
Epoch 1/5
875/875 [==============================] - 3s 3ms/step - loss: 0.1427
Epoch 2/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0841
Epoch 3/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 4/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
Epoch 5/5
875/875 [==============================] - 2s 2ms/step - loss: 0.0840
<keras.callbacks.History at 0x7fc41ac9c850>

Generating UI/Server based on initial selection

library(dbplyr)
library(dplyr)
library(shiny)
library(shinyWidgets)
library(DT)

df <- read.csv('https://raw.githubusercontent.com/datacfb123/testdata/main/sampleset_df.csv')
df2 <- read.csv('https://raw.githubusercontent.com/datacfb123/testdata/main/sampleset_df.csv')
df2$unique_id <- df2$unique_id*2  ##  just to check if switching works

ui <- fluidPage(
  titlePanel("Sample"),
  sidebarLayout(
    sidebarPanel(
      radioButtons("mydata", label = "Choose dataframe", choices = c("df","df2"), inline=TRUE),
      selectizeInput("data1", "Select State", choices = c(unique(df$state))),
      selectizeInput("data2", "Select County", choices = NULL),
      selectizeInput("data3", "Select City", choices = NULL, multiple = TRUE),
      selectizeInput("data4", "Select Demo", choices = c("All", unique(df$demo))),
      selectizeInput("data5", "Select Status", choices = c("All", unique(df$status))),
      sliderInput("age", label = h3("Select Age Range"), 18,
                  35, value = c(18, 20), round = TRUE, step = 1),
      sliderInput("score1", label = h3("Select Score1 Range"), min = 0,
                  max = 100, value = c(20,80)),
      conditionalPanel(condition = "input.mydata=='df'",
                       sliderInput("score2", label = h3("Select Score2 Range"), min = 0, max = 100, value = c(20,80))
                       ),
      prettyCheckboxGroup("phones", h3("Only Include Valid Phone Numbers?"), selected = "Yes", choices = list("Yes")),
      downloadButton("download", "Download Data")
    ),
    mainPanel(
      DTOutput("table")
    )
  )
)

server <- function(input, output, session){
  
  mydf <- reactive({get(input$mydata)})
  
  observeEvent(input$data1, {
    df <- mydf()
    #if (input$data1 != "All") {
      updateSelectizeInput(session, "data2", "Select County", server = TRUE, choices = c("All", unique(df$county[df$state == input$data1])))
    # } else {
    #   updateSelectizeInput(session, "data2", "Select County", server = TRUE, choices = c("All", unique(df$county)))
    # }
  }, priority = 2)

  observeEvent(c(input$data1, input$data2), {
    req(mydf())
    df <- mydf()
    if (input$data2 != "All") {
      updateSelectizeInput(session, "data3", "Select City", server = TRUE, choices = c("All", unique(df$city[df$county == input$data2])))
    } else {
      #if (input$data1 != "All") {
        updateSelectizeInput(session, "data3", "Select City", server = TRUE, choices = c("All", unique(df$city[df$state == input$data1])))
      # } else {
      #   updateSelectizeInput(session, "data3", "Select City", server = TRUE, choices = c("All", unique(df$city)))
      # }
    }
  }, priority = 1)

  filtered_data <- reactive({
    req(input$data3)
    temp_data <- mydf()
    if (input$data1 != "All") {
      temp_data <- temp_data[temp_data$state == input$data1, ]
    }
    if (input$data2 != "All") {
      temp_data <- temp_data[temp_data$county == input$data2, ]
    }
    if (input$data3 != "All") {
      temp_data <- temp_data[temp_data$city %in% input$data3, ]
    }
    if (input$data4 != "All") {
      temp_data <- temp_data[temp_data$demo %in% input$data4, ]
    }
    if (input$data5 != "All") {
      temp_data <- temp_data[temp_data$status %in% input$data5, ]
    }

    df2 <- temp_data %>% dplyr::filter(age >= input$age[1] &
                                       age <= input$age[2] &
                                       score1 >= input$score1[1] &
                                       score1 <= input$score1[2])
    if (input$mydata=="df") df2 <- df2 %>% dplyr::filter(score2 >= input$score2[1] & score2 <= input$score2[2])

    df3 <- if (is.null(input$phones)) df2 else df2 %>%  dplyr::filter(!is.na(phone))
    df3 %>% dplyr::select(unique_id, first_name, last_name, phone)
  })

  output$table <- renderDT(
    filtered_data()
  )

  output$download <- downloadHandler(
    filename = function() {
      paste("universe", "_", date(), ".csv", sep="")
    },
    content = function(file) {
      write.csv(filtered_data() %>% distinct_all(), file, row.names = FALSE)
    }
  )

}

shinyApp(ui, server)

How to focus an element only if it isn't currently focused?

onMouseDown={(e) => {
  if (document.activeElement === inputEl.current) {
    e.preventDefault();
  } else {
    /* do focus on input */
  }
}}

Inconsistent inheritance of interfaces with generic classes

type LiteralKeys<T> =
  { [K in keyof T]-?: T[K] extends Literal ? K : never }[keyof T];

type EagerlyEvaluated = LiteralKeys<IPerson> // "name"

function deferred<T extends IPerson>() {
    type Deferred = LiteralKeys<T>;
    // type Deferred = { [K in keyof T]-?: T[K] extends Literal ? K : never; }[keyof T]
    const name: Deferred = "name" // error!
}

export abstract class Person<T extends IPerson = IPerson> extends GraphNode<T> {
    setName(name: string) {
        (this as Person).set('name', name); // okay now
    }
}

Community Discussions

Trending Discussions on Notable
  • Referencing code outside the VS Code extension root folder
  • Milo OPC-UA Client NoSuchMethod error with io.netty.buffer.ByteBuf.writeMediumLE(int)
  • How can a simple date variable hurt my runperformance?
  • C Programming - Space Character Not Detected
  • Is there a way to set a vuetify data table page before the table is loaded?
  • Compress & Upload large videos to Google cloud storage using Flutter/Dart
  • In JavaScript, I cannot successfully access a randomly-selected property from my custom object class Question using dot notation
  • Merging of two files with substitution
  • Tensorflow tf.data.Dataset.cache seems do not take the expected effect
  • FirestoreRecyclerAdapter Recyclerview UI doesn't change after adding a message to the chatroom

QUESTION

Referencing code outside the VS Code extension root folder

Asked 2022-Mar-25 at 20:39

I have a monorepo with my VS Code extension code located in the vs-code-extension folder:

.
├── website
├── vs-code-extension
└── shared

I am trying to access code from the shared folder inside the code in vs-code-extension. The extension code is mostly the default that is auto-generated when the project is initially scaffolded. The only notable changes I have made are an import in extension.ts of a file from the shared folder, and a reference added to the tsconfig.json file as follows:

"references": [
  { "path": "../shared" }
],

As a result, the extension starts just fine; however, upon trying to execute the helloWorld command, I get the following error:

command 'my-project-name.helloWorld' not found

Upon removing the import of the shared folder's file from extension.ts, this error disappears.

My question is whether it is possible to reference code from outside the extension directory, and if so, what the right way to do it would be.

ANSWER

Answered 2022-Mar-25 at 20:39

I was able to solve this by importing the compiled .js version of the code from the shared folder instead of the .ts source.

Originally, while VS Code IntelliSense and the tsc build showed no problems with importing .ts files from the shared folder, the extension code compiled into .js (and placed in the out folder by default) was still attempting to import the .ts files from the shared folder, which caused the rest of the code to stop working.

The error returned by VS Code was too broad to identify this problem; hopefully this solution helps anyone facing similar issues when working with monorepos.
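For illustration only, a minimal sketch of that change in extension.ts, assuming a hypothetical helper formatReminder exported from the shared project; the file paths and the helper name are illustrative, not taken from the original question:

import * as vscode from 'vscode';
// Before: importing the TypeScript source compiles cleanly, but the emitted JS in out/
// still tries to load a .ts file at runtime, so command registration fails.
// import { formatReminder } from '../../shared/src/utils';
// After: point the import at the compiled .js output of the shared project instead.
import { formatReminder } from '../../shared/out/utils';

export function activate(context: vscode.ExtensionContext) {
  const disposable = vscode.commands.registerCommand('my-project-name.helloWorld', () => {
    vscode.window.showInformationMessage(formatReminder('Hello World'));
  });
  context.subscriptions.push(disposable);
}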

Source https://stackoverflow.com/questions/71611466

Community Discussions, Code Snippets contain sources that include Stack Exchange Network

Vulnerabilities

No vulnerabilities reported

Install Notable

You can download it from GitHub.
You can use Notable like any standard Java library. Include the jar files in your classpath. You can also use any IDE to run and debug the Notable component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org; for Gradle installation, please refer to gradle.org.
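As a rough sketch only (no releases are published, so there are no repository coordinates to depend on), a consuming project's build.gradle.kts might reference a locally built jar; the libs/notable.jar path is an assumption, not something documented by the project:

// build.gradle.kts -- minimal sketch, assuming you built Notable from source
// and copied the resulting jar into a local libs/ directory.
plugins {
    java
}

repositories {
    mavenCentral()
}

dependencies {
    // Hypothetical path; adjust to wherever your built jar actually lives.
    implementation(files("libs/notable.jar"))
}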

Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check for and ask questions on the Stack Overflow community page.
