ECE Research Seminars

The ECE research seminars focus on topics in information theory, signal processing, machine learning, blockchain, control and optimization, and related areas. If you are interested in giving a talk, please let me know (jinyuan@latech.edu). Due to COVID-19, the talks are currently delivered over Zoom and can be joined from this Zoom link.



  • Date: April 29, 2021, at 3:30pm, Jinyuan Chen (Louisiana Tech University)

    Title: Common-Randomness Aided Secure Communication

    Abstract: In many communication networks, secrecy constraints usually incur a penalty in capacity or in generalized degrees-of-freedom (GDoF, a form of capacity approximation). In this talk, we will show that this penalty can be nullified by providing common randomness (CR) to the transmitting users, but not to the receivers. Our work is motivated by cellular networks, where base stations can share CR over high-throughput backhaul cables. In this scenario, sharing CR between base stations (transmitters) is more practical than sharing CR between a base station (transmitter) and a mobile user (receiver).
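
    For intuition on where such a penalty comes from, consider the classical Gaussian wiretap channel (a textbook example, not the talk's exact model): without an eavesdropper the capacity is C = \tfrac{1}{2}\log(1+\mathrm{SNR}_B), while the secrecy capacity is

        C_s = \left[ \tfrac{1}{2}\log(1+\mathrm{SNR}_B) - \tfrac{1}{2}\log(1+\mathrm{SNR}_E) \right]^+,

    so the eavesdropper's term is exactly the secrecy penalty. The talk shows that, at the GDoF level, an analogous penalty can be removed when the transmitters share common randomness.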

    Bio: Dr. Jinyuan Chen is an Assistant Professor in the Electrical Engineering Department at Louisiana Tech University. Before joining Louisiana Tech, he was a Postdoctoral Scholar at Stanford University from 2014 to 2016. He received his PhD from Telecom ParisTech in 2014, his M.Sc. from Beijing University of Posts and Telecommunications in 2010, and his B.Sc. from Tianjin University in 2007. His research interests include information theory, distributed consensus, blockchain, and machine learning.



  • Date: May 11, 2021, at 3:30pm, Jihun Hamm (Tulane University)

    Title: Vulnerability of Provably-Robust Machine Learning Methods to Poisoning Attacks

    Abstract: Predictions of certifiably robust classifiers remain constant in a neighborhood of a point, making them resilient to test-time attacks with a guarantee. In this work, we present a previously unrecognized threat to robust machine learning models that highlights the importance of training-data quality in achieving high certified adversarial robustness. Specifically, we propose a novel bilevel-optimization-based data poisoning attack that degrades the robustness guarantees of certifiably robust classifiers. Unlike other poisoning attacks that reduce the accuracy of the poisoned models on a small set of target points, our attack reduces the average certified radius (ACR) of an entire target class in the dataset. Moreover, our attack is effective even when the victim trains the models from scratch using state-of-the-art robust training methods such as Gaussian data augmentation, MACER, and SmoothAdv that achieve high certified adversarial robustness. To make the attack harder to detect, we use clean-label poisoning points with imperceptible distortions. The effectiveness of the proposed method is evaluated by poisoning the MNIST and CIFAR10 datasets, training deep neural networks with the training methods above, and certifying their robustness with randomized smoothing. The ACR of the target class, for models trained on the generated poison data, can be reduced by more than 30%. Moreover, the poisoned data is transferable to models trained with different training methods and to models with different architectures.
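
    Since the attack is judged by how far it drives down the ACR, it helps to see how that metric is computed. Below is a minimal sketch under the standard randomized-smoothing certificate of Cohen et al.; the function names and inputs are illustrative, not the speaker's code.

        import numpy as np
        from scipy.stats import norm

        def certified_radius(p_a_lower, sigma):
            """Certified L2 radius at one point: sigma * Phi^{-1}(p_A), where
            p_a_lower lower-bounds the smoothed classifier's top-class
            probability under N(0, sigma^2 I) noise; 0 if it abstains."""
            return sigma * norm.ppf(p_a_lower) if p_a_lower > 0.5 else 0.0

        def average_certified_radius(correct, p_a_lower, sigma=0.25):
            """ACR over a set of points; misclassified points contribute 0."""
            radii = [certified_radius(p, sigma) if c else 0.0
                     for c, p in zip(correct, p_a_lower)]
            return float(np.mean(radii))

        # A successful poisoning attack lowers this average for the target
        # class while leaving clean accuracy roughly unchanged.
        print(average_certified_radius([True, True, False], [0.99, 0.80, 0.90]))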

    Bio: Dr. Hamm has been an Associate Professor of Computer Science at Tulane University since 2019. He received his PhD from the University of Pennsylvania in 2008, supervised by Dr. Daniel Lee. Dr. Hamm's research interest is in machine learning, from theory to applications. He has worked on efficient algorithms for adversarial machine learning, deep learning, privacy and security, optimization, and nonlinear dimensionality reduction. Dr. Hamm also has a background in biomedical engineering and has worked on medical data analysis, computational anatomy, and modeling human behaviors. His approach can be summarized as using machine learning to find novel solutions to challenging problems in applied fields. His work in machine learning has been published in top venues such as ICML, NIPS, CVPR, JMLR, and IEEE-TPAMI. His work has also been published in medical research venues such as MICCAI, MedIA, and IEEE-TMI. The academic community has recognized his contributions; among other awards, he has earned the Best Paper Award from MedIA (2010) and a Google Faculty Research Award (2015).



  • Date: May 18, 2021, at 3:30pm, Xiyuan Liu (Louisiana Tech University)

    Title: Introduction to Machine Learning in Classification and Clustering Analysis

    Abstract: Machine learning plays an essential role in today's era of big data analysis. One of the most important research problems in big data analysis is classifying observations into different groups, and there are two approaches to it: clustering and classification. Clustering is unsupervised learning; examples include Principal Components Analysis, K-Means Clustering, and Hierarchical Clustering. Classification, on the other hand, is supervised learning; examples include Logistic Regression, Linear Discriminant Analysis, and Conditional Random Fields. This presentation will briefly introduce these classical approaches and then take a closer look at Conditional Random Fields.
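
    As a minimal illustration of the two paradigms (a toy example, not from the talk), the snippet below fits an unsupervised and a supervised method to the same data: K-Means never sees the labels, while logistic regression learns from them.

        from sklearn.datasets import load_iris
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression

        X, y = load_iris(return_X_y=True)

        # Unsupervised: group observations using the features alone.
        clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

        # Supervised: learn a decision rule from labeled observations.
        clf = LogisticRegression(max_iter=1000).fit(X, y)

        print("cluster assignments:", clusters[:5])
        print("training accuracy:", clf.score(X, y))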

    Bio: Dr. Xiyuan Liu is an Assistant Professor in the Department of Mathematics & Statistics, Louisiana Tech University. His research interests include classification, high-dimensional linear regression, and non-parametric regression.



  • Date: May 20, 2021, at 3:30pm, Haifeng Wang (Mississippi State University)

    Title: On Deep Learning Architecture Optimization for Medical Image Analysis

    Abstract: Deep neural networks demonstrate remarkable generalization performance in many machine learning tasks. However, there is still no universal model that obtains the best performance on all problems (the "no free lunch" theorem). Deep neural architecture design is a time-consuming and challenging task, and it is even more challenging for medical image analysis due to the limited data sizes and the higher variation across scanning modalities. This talk discusses an adaptive deep learning model for disease classification in 3D medical image analysis. A novel objective function is proposed to optimize both the model's convergence trend and its accuracy during the learning process. The adaptive deep learning model can automatically identify the most effective and efficient model structure within a given architecture search space. Experiments are conducted to validate the performance of the proposed model and demonstrate better performance than many existing approaches.
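
    The abstract does not spell out the objective function, but the general idea of scoring a candidate architecture by both its accuracy and its convergence trend can be sketched as follows (all names and the penalty term are hypothetical, not the speaker's formulation).

        import numpy as np

        def architecture_score(val_acc_curve, alpha=0.8):
            """Hypothetical search objective: final validation accuracy minus
            a penalty for a non-improving (unstable) validation curve."""
            acc = np.asarray(val_acc_curve, dtype=float)
            instability = np.mean(np.maximum(0.0, -np.diff(acc)))
            return alpha * acc[-1] - (1.0 - alpha) * instability

        def pick_architecture(candidates):
            """Pick the best candidate from a search space, where each entry
            records an architecture and its validation-accuracy curve."""
            return max(candidates, key=lambda c: architecture_score(c["curve"]))

        best = pick_architecture([
            {"name": "3d-cnn-small", "curve": [0.50, 0.60, 0.70, 0.72]},
            {"name": "3d-cnn-deep",  "curve": [0.40, 0.70, 0.60, 0.75]},
        ])
        print(best["name"])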

    Bio: Haifeng Wang is an Assistant Professor in the Industrial and Systems Engineering Department at Mississippi State University (MSU). He received his Ph.D. and M.S. in Industrial and Systems Engineering from SUNY-Binghamton in 2019 and 2015, respectively. Dr. Wang’s research interests focus on the decision-making and intelligent control of complex engineering systems through machine learning, ensemble learning, neural network optimization, and real-time optimization. Application areas include computer-aided diagnosis, warehouse management, robotics, and smart manufacturing. His publications have appeared in journals such as EJOR, Neurocomputing, RCIM, IISE Transactions on Healthcare Systems Engineering, and Expert Systems with Applications. Dr. Wang has served as a reviewer for scientific journals such as Computers & Operations Research (COR), Information Systems, Cancers, Journal of Imaging, and IEEE Transactions on Neural Networks and Learning Systems (TNNLS). He is a member of the Institute of Industrial and Systems Engineers (IISE), the Institute for Operations Research and the Management Sciences (INFORMS), and the Medical Image Computing and Computer-Assisted Intervention (MICCAI) society. He chairs the Healthcare Systems Research Working Group at MSU and serves on the Academic Committee of the Society for Health Systems (SHS).



  • Date: September 30, 2021, at 4:00pm, Jinyuan Chen (Louisiana Tech University)

    Title: Coding for Byzantine Agreement

    Abstract: Byzantine agreement and its variants are considered fundamental building blocks for distributed systems and cryptography, while coding theory is a powerful tool for error detection and correction. In this talk we will show how to solve the Byzantine agreement problem using tools from coding theory, together with graph theory and linear algebra.
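
    A minimal sketch of the coding-theory ingredient (a toy Reed-Solomon-style code over a prime field, not the protocol from the talk): a polynomial of degree less than k is determined by any k of its n evaluations, so parties can detect symbols that a Byzantine sender has tampered with.

        P = 257  # a small prime; arithmetic is over GF(257)

        def encode(msg, n):
            """Treat msg (k field symbols) as polynomial coefficients and
            return its evaluations at x = 1..n."""
            return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P
                    for x in range(1, n + 1)]

        def interpolate_at(points, x0):
            """Lagrange interpolation over GF(P), evaluated at x0."""
            total = 0
            for xi, yi in points:
                num = den = 1
                for xj, _ in points:
                    if xj != xi:
                        num = num * (x0 - xj) % P
                        den = den * (xi - xj) % P
                total = (total + yi * num * pow(den, P - 2, P)) % P
            return total

        def consistent(shares, k):
            """Check that all n shares lie on one degree < k polynomial."""
            pts = list(enumerate(shares, start=1))
            return all(interpolate_at(pts[:k], x) == y for x, y in pts[k:])

        code = encode([7, 13, 42], n=7)   # k = 3 message symbols, n = 7 shares
        assert consistent(code, k=3)
        code[5] = (code[5] + 1) % P       # a Byzantine party alters one share
        print(consistent(code, k=3))      # False: the corruption is detected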

    Bio: Dr. Jinyuan Chen is an Assistant Professor in the Electrical Engineering Department at Louisiana Tech University. Before joining Louisiana Tech, he was a Postdoctoral Scholar at Stanford University from 2014 to 2016. He received his PhD from Telecom ParisTech in 2014, his M.Sc. from Beijing University of Posts and Telecommunications in 2010, and his B.Sc. from Tianjin University in 2007. His research interests include information theory, distributed consensus, blockchain, and machine learning.



  • Date: January 25, 2022, at 2:00pm, Nathan Green (Louisiana Tech University)

    Title: Transcendence in Function Fields

    Abstract: In this talk we will give a basic overview of our research field, suitable for a student with a strong engineering-math background. We will discuss applications of Drinfeld modules to proving transcendence results for zeta values and multiple zeta values over function fields. Over the complex numbers, special values of the Riemann zeta function are among the most studied objects in number theory, and yet we know very little about their transcendence. For odd positive values, we know that ζ(3) is irrational, but that's about it! Over function fields we have numerous powerful tools available from geometry that allow us to prove many more sophisticated results. This includes, for example, the transcendence and algebraic independence of function field zeta values for the rational function field. We will first review some of the basic theory and history of transcendental number theory, after which we will discuss how the algebraic techniques of Drinfeld modules and t-motives allow us to attack these problems. Finally, we will discuss our recent results in this direction, including a proof of the algebraic independence of special zeta values associated to higher genus curves and theorems laying the groundwork for proving the algebraic independence of multiple zeta values.
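
    To fix notation for readers new to the area (standard definitions, not the speaker's results): over the rational function field, the positive integers are replaced by monic polynomials, so the classical zeta values

        \zeta(s) = \sum_{n \ge 1} n^{-s}

    are replaced by the Carlitz zeta values

        \zeta_A(s) = \sum_{a \in \mathbb{F}_q[\theta],\ a\ \mathrm{monic}} a^{-s} \in \mathbb{F}_q((1/\theta)).

    For the rational function field, the transcendence of \zeta_A(n) for all positive integers n is known, as is the algebraic independence of these values; the talk's results concern extending such statements to higher genus curves and to multiple zeta values.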

    Bio: Dr. Green is an Assistant Professor of Mathematics at Louisiana Tech. His research area is algebraic number theory, where he studies transcendence questions related to special values of zeta functions over function fields. He graduated with his PhD in mathematics from Texas A&M University in 2018 and worked as a visiting assistant professor at UC San Diego from 2018 to 2021. He received his bachelor's and master's degrees from Brigham Young University in Utah. His research pushes the boundaries of his field; he has been invited as a main speaker at several international conferences in his research area and has published in many of the leading journals in mathematics.



  • Date: April 25, 2022, at 2:00pm, Natarajan Meghanathan (Jackson State University)

    Title: Neighborhood-based Bridge Node Centrality Tuple for Complex Network Analysis

    Abstract: We define a bridge node to be a node whose neighbors are sparsely connected to each other and are likely to end up in different components if the node is removed from the network. We propose a computationally light Neighborhood-based Bridge Node Centrality (NBNC) tuple that can be used to identify the bridge nodes of a network, as well as to rank the nodes of a network by their topological suitability to function as bridge nodes. The NBNC tuple for a node is computed asynchronously from the neighborhood graph of the node, which comprises the neighbors of the node as vertices and the links connecting those neighbors as edges. The tuple has three entries: the number of components in the neighborhood graph of the node, the algebraic connectivity ratio of the neighborhood graph of the node, and the number of neighbors of the node. We analyze a suite of 60 complex real-world networks and evaluate the computational lightness, effectiveness, efficiency/accuracy, and uniqueness of the NBNC tuple vis-a-vis existing bridgeness-related centrality metrics and the Louvain community detection algorithm.
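
    A small sketch of how such a tuple could be computed with networkx (the exact normalization of the algebraic connectivity ratio and the tie-breaking order of the ranking are assumptions here, not taken from the paper):

        import networkx as nx
        import numpy as np

        def nbnc_tuple(G, v):
            """(components of the neighborhood graph, algebraic connectivity
            ratio, number of neighbors) for node v."""
            H = G.subgraph(list(G.neighbors(v)))   # neighborhood graph of v
            n = H.number_of_nodes()
            comps = nx.number_connected_components(H)
            if n < 2:
                return comps, 0.0, n
            # Fiedler value: second-smallest Laplacian eigenvalue (~0 if
            # disconnected), normalized by n, the Fiedler value of K_n.
            lam = np.linalg.eigvalsh(nx.laplacian_matrix(H).toarray())[1]
            return comps, float(lam) / n, n

        def bridge_key(t):
            comps, ratio, deg = t
            # More components first, then lower connectivity, then more neighbors.
            return (-comps, ratio, -deg)

        G = nx.karate_club_graph()
        rank = sorted(G.nodes, key=lambda v: bridge_key(nbnc_tuple(G, v)))
        print("top bridge-node candidates:", rank[:5])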

    Bio: Dr. Natarajan Meghanathan is a tenured Full Professor of Computer Science at Jackson State University, Jackson, MS. He graduated with a Ph.D. in Computer Science from The University of Texas at Dallas in May 2005. Dr. Meghanathan has published more than 150 peer-reviewed articles (more than half of them journal publications), and he has received federal education and research grants from the U.S. National Science Foundation, the Army Research Lab, the Air Force Research Lab, NASA, and NIH. Dr. Meghanathan has served on the editorial boards of several international journals and on the technical program and organizing committees of several international conferences. His research interests are graph theory and network science, wireless ad hoc networks and sensor networks, cyber security, and machine learning.



  • Date: September 29, 2022, at 2:00pm, Fan Li (Louisiana Tech University)

    Title: Fundamental Limits and Insights: From Secure Communication to Distributed Computing (PhD Dissertation Defense)

    Abstract: This dissertation focuses on solving some fundamental problems in secure communication and distributed computing, using tools from information theory and coding theory. The first direction is secure communication, motivated by the ever-growing amount of sensitive data communicated over wireless networks. In many communication networks, secrecy constraints usually incur a penalty in capacity. In our work, we focus on how to remove this secrecy penalty in some scenarios, with the aid of common randomness or helpers. The second direction is distributed computing, which has come to prominence in industry with the availability of low-cost servers and big data. In our work, we characterize the fundamental tradeoff between computation load and communication load in some MapReduce distributed computing systems. Our work provides fundamental insights into secure communication and distributed computing, with potential applications in 6G, cybersecurity, edge computing, distributed machine learning, and beyond.
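
    For context on the second direction, the baseline result in coded distributed computing (Li, Maddah-Ali, Yu, and Avestimehr) characterizes the optimal communication load of a K-server MapReduce system as a function of the computation load r (the number of times each map task is replicated):

        L^*(r) = \frac{1}{r}\left(1 - \frac{r}{K}\right), \qquad r \in \{1, \dots, K\},

    so the communication load falls roughly in inverse proportion to the computation load. The dissertation's exact settings may differ from this baseline model.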

    Bio: Fan Li is currently pursuing her Ph.D. in the Electrical Engineering Department at Louisiana Tech University. She received her B.Sc. from Shandong Technology and Business University in 2012 and her M.Sc. from the University of Shanghai for Science and Technology in 2015. Her research interests include information theory, distributed computing, blockchain, and machine learning.