  • 1
    Online Resource
    Elsevier Ltd. | San Diego : Elsevier
    UID: almahu_9949737382902882
    Format: 1 online resource (197 pages)
    Edition: 1st ed.
    ISBN: 0-443-27317-0
    Note: Front Cover -- Construction Methods for an Autonomous Driving Map in an Intelligent Network Environment -- Copyright Page -- Contents -- About the author -- Preface -- 1 Introduction -- 1.1 Intelligent networked environment and intelligent networked vehicles -- 1.1.1 History of intelligent connected environment -- 1.1.2 Composition and characteristics of intelligent network environment -- 1.1.3 Connotation of intelligent networked vehicles -- 1.1.4 Development status of intelligent connected vehicles -- 1.1.5 Development status of intelligent transportation system -- 1.1.6 Smart ethical norm challenges -- 1.2 Overview of target sensing technology based on road sensors -- 1.2.1 Perception technology based on in-vehicle perception devices -- 1.2.2 Perception technology based on roadside sensing devices -- 1.2.3 Development status and trend of collaborative sensing technology -- 1.2.3.1 Collaborative perception information sharing -- 1.2.3.2 Collaborative perception information fusion -- 1.3 Development of vehicle motion behavior recognition methods and prediction technology -- 1.3.1 Connotation and significance of vehicle motion behavior recognition -- 1.3.2 Development status of vehicle motion behavior recognition methods -- 1.3.3 Development status of vehicle motion behavior prediction methods -- 1.4 Overview of autonomous driving map -- 1.4.1 Concept and connotation of autonomous driving map -- 1.4.2 Composition and application of autonomous driving maps -- 1.4.2.1 Environment perception level -- 1.4.2.2 Decision planning level -- 1.4.2.3 Control execution level -- 1.4.3 Status and trends of autonomous driving map development -- 1.4.3.1 Enhancement of high-precision maps -- 1.4.3.2 Real-time update and dynamic maps -- 1.4.3.3 Multisource data fusion -- 1.4.3.4 Semantic understanding and contextual awareness -- 1.4.3.5 Cloud map service. , 1.4.3.6 Global standardization -- 1.4.4 Bottleneck problems and solutions -- 1.4.4.1 Lack of unified top-level standards and specifications -- 1.4.4.2 Lack of mature theory and technical support -- 1.4.5 Implementation path for the above problems -- 1.4.5.1 Building standards, specifications, and system frameworks related to intelligent connected environment and intellig... -- 1.4.6 Carrying out technical research on bottleneck problems -- 1.4.6.1 Strengthening multidisciplinary and cross-industry cooperation and giving full play to the advantages of industry, ... 
-- 1.4.6.2 Cross-industry and multiindustry collaborative research -- 2 Fusion target perception method based on vehicle vision and radar -- 2.1 Fusion theory of vehicle multisource perception information -- 2.1.1 Basis of vehicle multisource perception information fusion -- 2.1.2 Fusion method and application of vehicle multisource perception information -- 2.1.2.1 Fusion strategy based on distinguishable units -- 2.1.2.2 Fusion strategy based on complementary features -- 2.1.2.2.1 Target parameter extraction -- 2.1.2.2.2 Data feature extraction -- 2.1.2.3 Fusion strategy based on target attributes -- 2.1.2.4 Fusion strategy based on multisource decision-making -- 2.2 Overview of object perception methods based on vehicle perception equipment -- 2.2.1 Object perception method based on vehicle vision -- 2.2.1.1 Target perception -- 2.2.1.2 Vehicle perception based on vehicle vision -- 2.2.1.2.1 Competent for occasions with high real-time requirements -- 2.2.1.2.2 Problem of missed detection -- 2.2.1.3 Pedestrian perception based on vehicle vision -- 2.2.1.4 Traffic sign perception based on vehicle vision -- 2.2.2 Target perception method based on vehicle millimeter-wave radar -- 2.2.2.1 Data preprocessing -- 2.2.2.2 Effective target selection. , 2.2.3 Target perception method based on vehicle LiDAR -- 2.2.3.1 LiDAR target data preprocessing -- 2.2.3.2 LiDAR filter processing -- 2.2.3.3 Keypoint extraction of LiDAR point cloud -- 2.2.3.4 LiDAR feature extraction -- 2.3 Target perception technology integrating vehicle vision and millimeter-wave radar information -- 2.3.1 Target perception framework integrating vehicle vision and millimeter-wave radar information -- 2.3.2 Spatial-temporal fusion method of vehicle vision and vehicle radar sensing information -- 2.3.2.1 Spatial integration -- 2.3.2.1.1 Direction -- 2.3.2.1.2 Positive direction -- 2.3.2.1.3 Operation -- 2.3.2.1.4 Distance value -- 2.3.2.2 Fusion in time -- 2.3.3 Target-aware collaborative positioning technology based on vehicle-mounted multisource information fusion -- 2.3.3.1 Real-time vehicle detection and tracking based on environmental perception -- 2.3.3.1.1 Straight line extraction method based on 3D point cloud -- 2.3.3.1.2 Vehicle detection and tracking with a fusion lane model -- 2.3.4 Target perception test and analysis based on vehicle multisource information fusion -- 2.4 Chapter summary -- 3 Crossfield of view object perception method -- 3.1 Basics of crossfield target perception -- 3.1.1 Connotation and meaning -- 3.1.2 Vision-based one-way side target perception method -- 3.1.2.1 Visual object detection algorithm -- 3.1.2.2 Visual target tracking algorithm -- 3.1.2.2.1 Algorithm initialization -- 3.1.2.2.2 Target association -- 3.1.3 Singleway side target perception method based on LiDAR -- 3.2 Vehicle target association matching method based on road perception information -- 3.2.1 Overview of vehicle target association matching -- 3.2.2 Traditional target association matching algorithm -- 3.2.2.1 Hungarian algorithm -- 3.2.2.2 Nearest neighbor search algorithm. 
, 3.2.3 Target association matching method based on single-channel method optimization -- 3.3 Crossfield of view target re-identification method -- 3.3.1 Crossfield of view target re-identification framework -- 3.3.2 Cross-view target perception technology in vehicle-road collaborative environment -- 3.3.2.1 Virtual point cloud generation -- 3.3.2.2 Feature enhancement of point cloud data -- 3.3.2.3 D grid style attention fusion mechanism -- 3.3.2.3.1 3D fusion -- 3.3.2.3.2 Mesh fusion -- 3.3.2.4 Attention fusion -- 3.3.2.5 Loss function and overall structure -- 3.3.3 Construction of crossfield target re-identification model -- 3.3.3.1 Loss function -- 3.3.3.2 Adaptive label smoothing -- 3.3.3.3 Data augmentation -- 3.3.3.4 Warm-up learning rate -- 3.3.4 Crossfield perception experimental platform construction and test verification -- 3.3.4.1 Experimental environment and settings -- 3.3.4.1.1 Public dataset -- 3.3.4.1.2 Self-collection dataset -- 3.3.4.1.3 Evaluation indicators -- 3.3.4.1.4 Experimental setup -- 3.3.4.2 Experimental results and analysis -- 3.3.4.3 Self-collection dataset experimental test -- 4 Vehicle motion recognition method based on vehicle road fusion information -- 4.1 Theory and method of vehicle road information fusion -- 4.1.1 Concept of vehicle road information fusion -- 4.1.2 Vehicle road information fusion method and application -- 4.1.2.1 Vehicle road information fusion method system -- 4.1.2.2 Classification of vehicle information fusion methods -- 4.1.2.3 Vehicle information fusion algorithm -- 4.2 Representation and analysis of vehicle motion state -- 4.2.1 Definition and characterization of vehicle motion state -- 4.2.1.1 Vehicle motion state definition -- 4.2.1.2 Vehicle motion state representation method -- 4.2.2 Analysis and discrimination of vehicle motion state. 
, 4.2.2.1 Dimension reduction and analysis of feature selection data -- 4.2.2.1.1 Filtered feature selection -- 4.2.2.1.2 Wrapper feature selection -- 4.2.2.1.3 Embedded feature selection -- 4.2.2.2 Dimension reduction and analysis of bag-of-words -- 4.3 Vehicle movement behavior recognition method -- 4.3.1 Overview of vehicle motion behavior recognition -- 4.3.1.1 Vehicle motion behavior recognition uses -- 4.3.1.2 Common movement behavior identification methods -- 4.3.1.2.1 Support vector machines -- 4.3.1.2.2 Naïve Bayes classification -- 4.3.1.2.3 K-nearest neighbor -- 4.3.2 Vehicle motion behavior recognition based on implicit Dirichlet distribution model -- 4.3.2.1 Theme model -- 4.3.2.2 Dirichlet allocation model -- 4.3.2.3 Application of latent Dirichlet allocation model in vehicle behavior recognition -- 4.3.3 Optimization of vehicle motion behavior recognition method -- 4.3.3.1 Problem description -- 4.3.3.2 Hidden Dirichlet distribution model with labels -- 4.3.3.3 Model training -- 4.3.3.4 Vehicle motion behavior recognition process -- 4.3.3.5 Introduction to experiments and datasets -- 4.3.3.5.1 Experiments and datasets -- 4.3.3.5.2 Experimental comparison -- 4.4 Chapter summary -- 5 Vehicle trajectory prediction method based on improved hybrid neural network -- 5.1 Basic theory and characterization methods of vehicle trajectory prediction -- 5.1.1 Overview of vehicle trajectory prediction -- 5.1.2 Characterization method of motion state sequence of vehicles -- 5.1.2.1 Historical sequence of vehicle's motion state -- 5.1.2.2 Predicted sequence of vehicle motion state -- 5.1.2.3 Sequence of mixed traffic subjects' motion state -- 5.2 Analysis of spatiotemporal relationship of vehicle trajectory -- 5.2.1 Time domain analysis and prediction of vehicle trajectory based on long short-term memory networks -- 5.2.1.1 Input gate. , 5.2.1.2 Oblivion gate.
    Additional Edition: ISBN 0-443-27316-2
    Language: English
  • 2
    UID: almahu_9949767444502882
    Format: 1 online resource (350 pages)
    Edition: 1st ed.
    ISBN: 9781529212655
    Content: In this pioneering work, expert scholars offer new thinking on proactivity by examining how emotion can drive employees' proactivity in the workplace and how, in turn, that proactivity can shape one's emotional experiences.
    Note: Front Cover -- Emotion and Proactivity at Work: Prospects and Dialogues -- Copyright information -- Table of contents -- List of Figures and Tables -- Notes on Contributors -- Acknowledgments -- Foreword -- Emotion and Proactivity at Work: Where Are We Now? -- Part I Emotion and Proactivity - Why and How It Matters -- 1 Feeling Energized to Become Proactive -- A quantitative-based review of the affect-proactivity link -- Sample and procedure -- Affect-proactive work behaviour link -- Affect-proactive person-environment fit behaviour link -- Qualitative-based review on highly relevant and frequently cited papers -- Theoretical lenses in the affect-proactivity link -- Positive and negative affect and proactivity -- Proactive work behaviours -- Proactive personal-environment fit behaviours -- Discrete emotions and proactivity -- Proactive work behaviour -- The emotional consequences of proactivity -- Emotional regulation and proactivity -- A short outlook -- Conclusion -- Future research -- Notes -- References -- 2 Igniting Initiative -- Current conceptualizations of energized-to proactive motivation -- Defining affect -- A conceptual focus on core affect -- The role of positive emotional states -- The role of negative emotional states -- The role of work engagement -- Limitations of current conceptualizations of energized-to proactive motivation -- Ways forward: Clarifying the energized-to pathway -- Focusing on discrete emotions -- How: Identifying different effects on the form or stage of proactivity -- When: Focusing on contingent factors linking negative emotions to proactivity -- Clarifying how work engagement shapes proactivity -- Engagement as more than an energized-to state -- Vigour and the energized-to pathway -- Dedication and the reason-to pathway -- Absorption and the can-do pathway -- Linking work engagement to proactivity. 
, Conclusion -- Implications and future directions -- Summary -- References -- Part II The Role of Emotion in Shaping Proactivity in Different Contexts -- 3 A Multilevel Model of Emotions and Proactive Behaviour -- The Five-Level Model of Emotions in the Workplace -- Level 1: Within person -- Level 2: Between-persons -- Level 3: Interpersonal relationships -- Level 4: Groups and teams -- Level 5: The organization as a whole -- Summary of the five levels -- A multilevel model of emotions and proactivity -- The dynamic nature of the FLMEW -- The interactive nature of the FLMEW -- Future research -- Conclusion -- References -- 4 Affective Events and Proactivity -- Affective events theory -- Affective events and proactive behaviour: overview of previous research -- Effects of Positive Versus Negative Work Events on Affect, Motivation, and Proactive Behaviour -- Effects of task versus interpersonal work events on affect, motivation, and proactive behaviour -- Effects of work versus non-work events on affect, motivation, and proactive behaviour -- Proactive behaviour as an affective event -- Open questions and future research -- References -- 5 Exploring Cross-Domain Relations between Emotional Energy and Proactivity -- Employee proactivity -- Non-work-domain predictors of employee proactivity -- Off-Job experiences and employee proactivity -- Sleep and employee proactivity -- Summary -- Emotional energy as a cross-domain mechanism -- Future research -- Practical implications -- Conclusion -- References -- 6 Job Insecurity and Discretionary Behaviours at Work -- Job insecurity and employee discretionary behaviours -- Job insecurity and employee discretionary behaviours: a discrete emotions perspective -- Approach-oriented and avoidance-oriented emotions: anger, frustration, fear, and shame -- Discrete emotions and discretionary behaviours at work. , Boundary conditions -- Individual approach-avoidance temperament -- Individual attribution of JI: self, organization, and environment -- Discussion -- References -- 7 Other-Praising Emotions and Employee Proactivity -- Other-praising emotions -- Gratitude and proactive prosocial behaviour -- Elevation and proactive moral behaviour -- Admiration and proactive learning behaviour -- Awe and proactive self-transcendent behaviour -- Conclusion and future research -- References -- 8 Leader's Anger and Employee Upward Voice -- Leader's emotional expression and employee proactive behaviours -- Anger in social interactions -- Two types of anger and voice -- Leader's display of task-focused anger -- Leader's display of person-focused anger -- Research journey and methodology -- Phase 1. Survey instrument development and validation -- Phase 2. 
The field study -- Measures -- Analysis -- Results -- Discussions -- Future research -- References -- 9 Affect and Proactivity in Teams -- Team effectiveness model -- Affective tones and proactivity -- The effects of team affective tones on team proactivity -- The effects of team behavioral processes on team affective tones -- A Note on Affect Dispersion and Diversity -- The etiology of team affective tones -- Conclusion -- References -- 10 The Dual Pathway Model of Group Affective Tone on Team Creativity -- Positive group affective tone and team creativity -- Negative group affective tone and team creativity -- A dual pathway model -- Promotion-focused pathway: The moderating roles of task complexity and the team supportive context -- Prevention-focused pathway: The moderating roles of task complexity and the team supportive context -- Theoretical extension and emerging areas in the GAT-creativity field -- How dynamics and fluctuation of group affective tone influence team creativity. , How affective diversity in group affective tone influences team creativity -- How team members' personality traits influence the effects of group affective tone on team creativity -- References -- Part III The Emotional Consequences of Proactivity -- 11 Proactivity and Well-Being -- A quantitative review and scientific mapping -- A developmental perspective/pathway -- A resource depletion perspective/pathway -- Moderators and dynamic spirals -- Individual level factors -- Situational contingencies -- Dynamic spirals -- Alternative pathways -- What to study next? -- Further unpack the proactive journey -- Examine differential mediating mechanisms -- Welcome methodological changes -- Consider contextual differences -- Conclusion -- References -- 12 Affective Consequences of Proactivity -- Affective experiences at work -- Conceptual model of affective consequences of proactive behaviour -- Effects of individual differences and change in proactive personality and behaviour -- Boundary conditions of effects of proactivity -- Psychological states resulting from changes in the self and the work environment -- The role of trait affectivity -- Between-person differences and within-person change in proactivity and affect -- Literature review -- Conceptual approaches to affective consequences of proactivity -- Empirical studies on affective consequences of proactive personality -- Empirical studies on affective consequences of proactive behaviour -- Beneficial effects of proactive behaviour -- Detrimental effects of proactive behaviour -- Dual-pathway effects of proactive behaviour -- Suggestions for future research -- Conclusion -- References -- Conclusions and Future Directions -- An integrative review -- Four research avenues -- References -- Index -- Back Cover.
    Additional Edition: Print version: Parker, Sharon. Emotion and Proactivity at Work. Bristol : Bristol University Press, c2021. ISBN 9781529208306
    Language: English
    Keywords: Electronic books.
  • 3
    Online Resource
    Elsevier Ltd. | San Diego : Elsevier
    UID: edocfu_9961520228602883
    Format: 1 online resource (197 pages)
    Edition: 1st ed.
    ISBN: 0-443-27317-0
    Note: Front Cover -- Construction Methods for an Autonomous Driving Map in an Intelligent Network Environment -- Copyright Page -- Contents -- About the author -- Preface -- 1 Introduction -- 1.1 Intelligent networked environment and intelligent networked vehicles -- 1.1.1 History of intelligent connected environment -- 1.1.2 Composition and characteristics of intelligent network environment -- 1.1.3 Connotation of intelligent networked vehicles -- 1.1.4 Development status of intelligent connected vehicles -- 1.1.5 Development status of intelligent transportation system -- 1.1.6 Smart ethical norm challenges -- 1.2 Overview of target sensing technology based on road sensors -- 1.2.1 Perception technology based on in-vehicle perception devices -- 1.2.2 Perception technology based on roadside sensing devices -- 1.2.3 Development status and trend of collaborative sensing technology -- 1.2.3.1 Collaborative perception information sharing -- 1.2.3.2 Collaborative perception information fusion -- 1.3 Development of vehicle motion behavior recognition methods and prediction technology -- 1.3.1 Connotation and significance of vehicle motion behavior recognition -- 1.3.2 Development status of vehicle motion behavior recognition methods -- 1.3.3 Development status of vehicle motion behavior prediction methods -- 1.4 Overview of autonomous driving map -- 1.4.1 Concept and connotation of autonomous driving map -- 1.4.2 Composition and application of autonomous driving maps -- 1.4.2.1 Environment perception level -- 1.4.2.2 Decision planning level -- 1.4.2.3 Control execution level -- 1.4.3 Status and trends of autonomous driving map development -- 1.4.3.1 Enhancement of high-precision maps -- 1.4.3.2 Real-time update and dynamic maps -- 1.4.3.3 Multisource data fusion -- 1.4.3.4 Semantic understanding and contextual awareness -- 1.4.3.5 Cloud map service. , 1.4.3.6 Global standardization -- 1.4.4 Bottleneck problems and solutions -- 1.4.4.1 Lack of unified top-level standards and specifications -- 1.4.4.2 Lack of mature theory and technical support -- 1.4.5 Implementation path for the above problems -- 1.4.5.1 Building standards, specifications, and system frameworks related to intelligent connected environment and intellig... -- 1.4.6 Carrying out technical research on bottleneck problems -- 1.4.6.1 Strengthening multidisciplinary and cross-industry cooperation and giving full play to the advantages of industry, ... 
-- 1.4.6.2 Cross-industry and multiindustry collaborative research -- 2 Fusion target perception method based on vehicle vision and radar -- 2.1 Fusion theory of vehicle multisource perception information -- 2.1.1 Basis of vehicle multisource perception information fusion -- 2.1.2 Fusion method and application of vehicle multisource perception information -- 2.1.2.1 Fusion strategy based on distinguishable units -- 2.1.2.2 Fusion strategy based on complementary features -- 2.1.2.2.1 Target parameter extraction -- 2.1.2.2.2 Data feature extraction -- 2.1.2.3 Fusion strategy based on target attributes -- 2.1.2.4 Fusion strategy based on multisource decision-making -- 2.2 Overview of object perception methods based on vehicle perception equipment -- 2.2.1 Object perception method based on vehicle vision -- 2.2.1.1 Target perception -- 2.2.1.2 Vehicle perception based on vehicle vision -- 2.2.1.2.1 Competent for occasions with high real-time requirements -- 2.2.1.2.2 Problem of missed detection -- 2.2.1.3 Pedestrian perception based on vehicle vision -- 2.2.1.4 Traffic sign perception based on vehicle vision -- 2.2.2 Target perception method based on vehicle millimeter-wave radar -- 2.2.2.1 Data preprocessing -- 2.2.2.2 Effective target selection. , 2.2.3 Target perception method based on vehicle LiDAR -- 2.2.3.1 LiDAR target data preprocessing -- 2.2.3.2 LiDAR filter processing -- 2.2.3.3 Keypoint extraction of LiDAR point cloud -- 2.2.3.4 LiDAR feature extraction -- 2.3 Target perception technology integrating vehicle vision and millimeter-wave radar information -- 2.3.1 Target perception framework integrating vehicle vision and millimeter-wave radar information -- 2.3.2 Spatial-temporal fusion method of vehicle vision and vehicle radar sensing information -- 2.3.2.1 Spatial integration -- 2.3.2.1.1 Direction -- 2.3.2.1.2 Positive direction -- 2.3.2.1.3 Operation -- 2.3.2.1.4 Distance value -- 2.3.2.2 Fusion in time -- 2.3.3 Target-aware collaborative positioning technology based on vehicle-mounted multisource information fusion -- 2.3.3.1 Real-time vehicle detection and tracking based on environmental perception -- 2.3.3.1.1 Straight line extraction method based on 3D point cloud -- 2.3.3.1.2 Vehicle detection and tracking with a fusion lane model -- 2.3.4 Target perception test and analysis based on vehicle multisource information fusion -- 2.4 Chapter summary -- 3 Crossfield of view object perception method -- 3.1 Basics of crossfield target perception -- 3.1.1 Connotation and meaning -- 3.1.2 Vision-based one-way side target perception method -- 3.1.2.1 Visual object detection algorithm -- 3.1.2.2 Visual target tracking algorithm -- 3.1.2.2.1 Algorithm initialization -- 3.1.2.2.2 Target association -- 3.1.3 Singleway side target perception method based on LiDAR -- 3.2 Vehicle target association matching method based on road perception information -- 3.2.1 Overview of vehicle target association matching -- 3.2.2 Traditional target association matching algorithm -- 3.2.2.1 Hungarian algorithm -- 3.2.2.2 Nearest neighbor search algorithm. 
, 3.2.3 Target association matching method based on single-channel method optimization -- 3.3 Crossfield of view target re-identification method -- 3.3.1 Crossfield of view target re-identification framework -- 3.3.2 Cross-view target perception technology in vehicle-road collaborative environment -- 3.3.2.1 Virtual point cloud generation -- 3.3.2.2 Feature enhancement of point cloud data -- 3.3.2.3 D grid style attention fusion mechanism -- 3.3.2.3.1 3D fusion -- 3.3.2.3.2 Mesh fusion -- 3.3.2.4 Attention fusion -- 3.3.2.5 Loss function and overall structure -- 3.3.3 Construction of crossfield target re-identification model -- 3.3.3.1 Loss function -- 3.3.3.2 Adaptive label smoothing -- 3.3.3.3 Data augmentation -- 3.3.3.4 Warm-up learning rate -- 3.3.4 Crossfield perception experimental platform construction and test verification -- 3.3.4.1 Experimental environment and settings -- 3.3.4.1.1 Public dataset -- 3.3.4.1.2 Self-collection dataset -- 3.3.4.1.3 Evaluation indicators -- 3.3.4.1.4 Experimental setup -- 3.3.4.2 Experimental results and analysis -- 3.3.4.3 Self-collection dataset experimental test -- 4 Vehicle motion recognition method based on vehicle road fusion information -- 4.1 Theory and method of vehicle road information fusion -- 4.1.1 Concept of vehicle road information fusion -- 4.1.2 Vehicle road information fusion method and application -- 4.1.2.1 Vehicle road information fusion method system -- 4.1.2.2 Classification of vehicle information fusion methods -- 4.1.2.3 Vehicle information fusion algorithm -- 4.2 Representation and analysis of vehicle motion state -- 4.2.1 Definition and characterization of vehicle motion state -- 4.2.1.1 Vehicle motion state definition -- 4.2.1.2 Vehicle motion state representation method -- 4.2.2 Analysis and discrimination of vehicle motion state. 
, 4.2.2.1 Dimension reduction and analysis of feature selection data -- 4.2.2.1.1 Filtered feature selection -- 4.2.2.1.2 Wrapper feature selection -- 4.2.2.1.3 Embedded feature selection -- 4.2.2.2 Dimension reduction and analysis of bag-of-words -- 4.3 Vehicle movement behavior recognition method -- 4.3.1 Overview of vehicle motion behavior recognition -- 4.3.1.1 Vehicle motion behavior recognition uses -- 4.3.1.2 Common movement behavior identification methods -- 4.3.1.2.1 Support vector machines -- 4.3.1.2.2 Naïve Bayes classification -- 4.3.1.2.3 K-nearest neighbor -- 4.3.2 Vehicle motion behavior recognition based on implicit Dirichlet distribution model -- 4.3.2.1 Theme model -- 4.3.2.2 Dirichlet allocation model -- 4.3.2.3 Application of latent Dirichlet allocation model in vehicle behavior recognition -- 4.3.3 Optimization of vehicle motion behavior recognition method -- 4.3.3.1 Problem description -- 4.3.3.2 Hidden Dirichlet distribution model with labels -- 4.3.3.3 Model training -- 4.3.3.4 Vehicle motion behavior recognition process -- 4.3.3.5 Introduction to experiments and datasets -- 4.3.3.5.1 Experiments and datasets -- 4.3.3.5.2 Experimental comparison -- 4.4 Chapter summary -- 5 Vehicle trajectory prediction method based on improved hybrid neural network -- 5.1 Basic theory and characterization methods of vehicle trajectory prediction -- 5.1.1 Overview of vehicle trajectory prediction -- 5.1.2 Characterization method of motion state sequence of vehicles -- 5.1.2.1 Historical sequence of vehicle's motion state -- 5.1.2.2 Predicted sequence of vehicle motion state -- 5.1.2.3 Sequence of mixed traffic subjects' motion state -- 5.2 Analysis of spatiotemporal relationship of vehicle trajectory -- 5.2.1 Time domain analysis and prediction of vehicle trajectory based on long short-term memory networks -- 5.2.1.1 Input gate. , 5.2.1.2 Oblivion gate.
    Additional Edition: ISBN 0-443-27316-2
    Language: English
  • 4
    Online Resource
    Elsevier Ltd. | San Diego : Elsevier
    UID: edoccha_9961520228602883
    Format: 1 online resource (197 pages)
    Edition: 1st ed.
    ISBN: 0-443-27317-0
    Note: Front Cover -- Construction Methods for an Autonomous Driving Map in an Intelligent Network Environment -- Copyright Page -- Contents -- About the author -- Preface -- 1 Introduction -- 1.1 Intelligent networked environment and intelligent networked vehicles -- 1.1.1 History of intelligent connected environment -- 1.1.2 Composition and characteristics of intelligent network environment -- 1.1.3 Connotation of intelligent networked vehicles -- 1.1.4 Development status of intelligent connected vehicles -- 1.1.5 Development status of intelligent transportation system -- 1.1.6 Smart ethical norm challenges -- 1.2 Overview of target sensing technology based on road sensors -- 1.2.1 Perception technology based on in-vehicle perception devices -- 1.2.2 Perception technology based on roadside sensing devices -- 1.2.3 Development status and trend of collaborative sensing technology -- 1.2.3.1 Collaborative perception information sharing -- 1.2.3.2 Collaborative perception information fusion -- 1.3 Development of vehicle motion behavior recognition methods and prediction technology -- 1.3.1 Connotation and significance of vehicle motion behavior recognition -- 1.3.2 Development status of vehicle motion behavior recognition methods -- 1.3.3 Development status of vehicle motion behavior prediction methods -- 1.4 Overview of autonomous driving map -- 1.4.1 Concept and connotation of autonomous driving map -- 1.4.2 Composition and application of autonomous driving maps -- 1.4.2.1 Environment perception level -- 1.4.2.2 Decision planning level -- 1.4.2.3 Control execution level -- 1.4.3 Status and trends of autonomous driving map development -- 1.4.3.1 Enhancement of high-precision maps -- 1.4.3.2 Real-time update and dynamic maps -- 1.4.3.3 Multisource data fusion -- 1.4.3.4 Semantic understanding and contextual awareness -- 1.4.3.5 Cloud map service. , 1.4.3.6 Global standardization -- 1.4.4 Bottleneck problems and solutions -- 1.4.4.1 Lack of unified top-level standards and specifications -- 1.4.4.2 Lack of mature theory and technical support -- 1.4.5 Implementation path for the above problems -- 1.4.5.1 Building standards, specifications, and system frameworks related to intelligent connected environment and intellig... -- 1.4.6 Carrying out technical research on bottleneck problems -- 1.4.6.1 Strengthening multidisciplinary and cross-industry cooperation and giving full play to the advantages of industry, ... 
-- 1.4.6.2 Cross-industry and multiindustry collaborative research -- 2 Fusion target perception method based on vehicle vision and radar -- 2.1 Fusion theory of vehicle multisource perception information -- 2.1.1 Basis of vehicle multisource perception information fusion -- 2.1.2 Fusion method and application of vehicle multisource perception information -- 2.1.2.1 Fusion strategy based on distinguishable units -- 2.1.2.2 Fusion strategy based on complementary features -- 2.1.2.2.1 Target parameter extraction -- 2.1.2.2.2 Data feature extraction -- 2.1.2.3 Fusion strategy based on target attributes -- 2.1.2.4 Fusion strategy based on multisource decision-making -- 2.2 Overview of object perception methods based on vehicle perception equipment -- 2.2.1 Object perception method based on vehicle vision -- 2.2.1.1 Target perception -- 2.2.1.2 Vehicle perception based on vehicle vision -- 2.2.1.2.1 Competent for occasions with high real-time requirements -- 2.2.1.2.2 Problem of missed detection -- 2.2.1.3 Pedestrian perception based on vehicle vision -- 2.2.1.4 Traffic sign perception based on vehicle vision -- 2.2.2 Target perception method based on vehicle millimeter-wave radar -- 2.2.2.1 Data preprocessing -- 2.2.2.2 Effective target selection. , 2.2.3 Target perception method based on vehicle LiDAR -- 2.2.3.1 LiDAR target data preprocessing -- 2.2.3.2 LiDAR filter processing -- 2.2.3.3 Keypoint extraction of LiDAR point cloud -- 2.2.3.4 LiDAR feature extraction -- 2.3 Target perception technology integrating vehicle vision and millimeter-wave radar information -- 2.3.1 Target perception framework integrating vehicle vision and millimeter-wave radar information -- 2.3.2 Spatial-temporal fusion method of vehicle vision and vehicle radar sensing information -- 2.3.2.1 Spatial integration -- 2.3.2.1.1 Direction -- 2.3.2.1.2 Positive direction -- 2.3.2.1.3 Operation -- 2.3.2.1.4 Distance value -- 2.3.2.2 Fusion in time -- 2.3.3 Target-aware collaborative positioning technology based on vehicle-mounted multisource information fusion -- 2.3.3.1 Real-time vehicle detection and tracking based on environmental perception -- 2.3.3.1.1 Straight line extraction method based on 3D point cloud -- 2.3.3.1.2 Vehicle detection and tracking with a fusion lane model -- 2.3.4 Target perception test and analysis based on vehicle multisource information fusion -- 2.4 Chapter summary -- 3 Crossfield of view object perception method -- 3.1 Basics of crossfield target perception -- 3.1.1 Connotation and meaning -- 3.1.2 Vision-based one-way side target perception method -- 3.1.2.1 Visual object detection algorithm -- 3.1.2.2 Visual target tracking algorithm -- 3.1.2.2.1 Algorithm initialization -- 3.1.2.2.2 Target association -- 3.1.3 Singleway side target perception method based on LiDAR -- 3.2 Vehicle target association matching method based on road perception information -- 3.2.1 Overview of vehicle target association matching -- 3.2.2 Traditional target association matching algorithm -- 3.2.2.1 Hungarian algorithm -- 3.2.2.2 Nearest neighbor search algorithm. 
, 3.2.3 Target association matching method based on single-channel method optimization -- 3.3 Crossfield of view target re-identification method -- 3.3.1 Crossfield of view target re-identification framework -- 3.3.2 Cross-view target perception technology in vehicle-road collaborative environment -- 3.3.2.1 Virtual point cloud generation -- 3.3.2.2 Feature enhancement of point cloud data -- 3.3.2.3 D grid style attention fusion mechanism -- 3.3.2.3.1 3D fusion -- 3.3.2.3.2 Mesh fusion -- 3.3.2.4 Attention fusion -- 3.3.2.5 Loss function and overall structure -- 3.3.3 Construction of crossfield target re-identification model -- 3.3.3.1 Loss function -- 3.3.3.2 Adaptive label smoothing -- 3.3.3.3 Data augmentation -- 3.3.3.4 Warm-up learning rate -- 3.3.4 Crossfield perception experimental platform construction and test verification -- 3.3.4.1 Experimental environment and settings -- 3.3.4.1.1 Public dataset -- 3.3.4.1.2 Self-collection dataset -- 3.3.4.1.3 Evaluation indicators -- 3.3.4.1.4 Experimental setup -- 3.3.4.2 Experimental results and analysis -- 3.3.4.3 Self-collection dataset experimental test -- 4 Vehicle motion recognition method based on vehicle road fusion information -- 4.1 Theory and method of vehicle road information fusion -- 4.1.1 Concept of vehicle road information fusion -- 4.1.2 Vehicle road information fusion method and application -- 4.1.2.1 Vehicle road information fusion method system -- 4.1.2.2 Classification of vehicle information fusion methods -- 4.1.2.3 Vehicle information fusion algorithm -- 4.2 Representation and analysis of vehicle motion state -- 4.2.1 Definition and characterization of vehicle motion state -- 4.2.1.1 Vehicle motion state definition -- 4.2.1.2 Vehicle motion state representation method -- 4.2.2 Analysis and discrimination of vehicle motion state. 
, 4.2.2.1 Dimension reduction and analysis of feature selection data -- 4.2.2.1.1 Filtered feature selection -- 4.2.2.1.2 Wrapper feature selection -- 4.2.2.1.3 Embedded feature selection -- 4.2.2.2 Dimension reduction and analysis of bag-of-words -- 4.3 Vehicle movement behavior recognition method -- 4.3.1 Overview of vehicle motion behavior recognition -- 4.3.1.1 Vehicle motion behavior recognition uses -- 4.3.1.2 Common movement behavior identification methods -- 4.3.1.2.1 Support vector machines -- 4.3.1.2.2 Naïve Bayes classification -- 4.3.1.2.3 K-nearest neighbor -- 4.3.2 Vehicle motion behavior recognition based on implicit Dirichlet distribution model -- 4.3.2.1 Theme model -- 4.3.2.2 Dirichlet allocation model -- 4.3.2.3 Application of latent Dirichlet allocation model in vehicle behavior recognition -- 4.3.3 Optimization of vehicle motion behavior recognition method -- 4.3.3.1 Problem description -- 4.3.3.2 Hidden Dirichlet distribution model with labels -- 4.3.3.3 Model training -- 4.3.3.4 Vehicle motion behavior recognition process -- 4.3.3.5 Introduction to experiments and datasets -- 4.3.3.5.1 Experiments and datasets -- 4.3.3.5.2 Experimental comparison -- 4.4 Chapter summary -- 5 Vehicle trajectory prediction method based on improved hybrid neural network -- 5.1 Basic theory and characterization methods of vehicle trajectory prediction -- 5.1.1 Overview of vehicle trajectory prediction -- 5.1.2 Characterization method of motion state sequence of vehicles -- 5.1.2.1 Historical sequence of vehicle's motion state -- 5.1.2.2 Predicted sequence of vehicle motion state -- 5.1.2.3 Sequence of mixed traffic subjects' motion state -- 5.2 Analysis of spatiotemporal relationship of vehicle trajectory -- 5.2.1 Time domain analysis and prediction of vehicle trajectory based on long short-term memory networks -- 5.2.1.1 Input gate. , 5.2.1.2 Oblivion gate.
    Additional Edition: ISBN 0-443-27316-2
    Language: English
  • 5
    UID: gbv_1879912252
    Format: 2, 288 pages, illustrations
    Edition: 1st ed.
    Original writing title: 当代中国的司法改革研究 (Research on Judicial Reform in Contemporary China)
    Original writing person/organisation: 陈志军 (Chen Zhijun)
    Original writing publisher: 哈尔滨 : 哈尔滨工程大学出版社 (Harbin : Harbin Engineering University Press)
    ISBN: 9787566115065
    Note: Includes bibliographical references
    Language: Chinese