
Publications

Featured research published by Ali T. Koc.


International Conference on Communications | 2009

Interference-Aware Energy-Efficient Power Optimization

Guowang Miao; Nageen Himayat; Geoffrey Ye Li; Ali T. Koc; Shilpa Talwar

While the demand for battery capacity on mobile devices has grown with the rise of high-bandwidth, multimedia-rich applications, battery technology has not kept pace. Therefore, power optimization techniques are becoming increasingly important in wireless system design. Power optimization schemes are also important for interference management, since interference resulting from aggressive spectral reuse and high-power transmission severely limits system performance. Although power optimization plays a pivotal role in both interference management and energy utilization, little research addresses their joint interaction. In this paper, we develop energy-efficient power optimization schemes for interference-limited communications. Both circuit and transmit powers are considered, and energy efficiency is emphasized over throughput. We note that the general power optimization problem in the presence of interference is intractable even when ideal user cooperation is assumed. We first study this problem for a simple two-user network with ideal user cooperation and then develop a practical non-cooperative power optimization scheme. Simulation results show that the proposed scheme improves not only energy efficiency but also spectral efficiency in an interference-limited cellular network.
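The circuit-versus-transmit-power tradeoff described in this abstract can be illustrated with a toy single-link model (a sketch with made-up numbers, not the authors' scheme): energy efficiency in bits per joule first rises with transmit power while fixed circuit power dominates, then falls because throughput grows only logarithmically.

```python
import math

def energy_efficiency(p_tx, gain=1.0, noise=1e-3, p_circuit=0.1, bandwidth=1.0):
    """Bits per joule for one link: Shannon rate / total power (hypothetical model)."""
    throughput = bandwidth * math.log2(1.0 + p_tx * gain / noise)
    return throughput / (p_tx + p_circuit)

# Scan transmit powers: the optimum is interior, neither minimal nor maximal power.
powers = [0.01 * k for k in range(1, 1001)]
best_p = max(powers, key=energy_efficiency)
```

With these illustrative parameters the scan lands on an intermediate power level, which is the qualitative behavior that motivates optimizing energy efficiency rather than throughput alone.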


IEEE Communications Magazine | 2013

Energy impact of emerging mobile internet applications on LTE networks: issues and solutions

Maruti Gupta; Satish Chandra Jha; Ali T. Koc; Rath Vannithamby

Mobile Internet applications run on devices such as smartphones and tablets, and have dramatically changed the landscape of application-generated network traffic. The potent combination of millions of such applications and the instant accessibility of high-speed Internet on mobile devices through 3G and now LTE technology has also changed how users themselves interact with the Internet. Specifically, the radio states in LTE such as RRC_Connected and RRC_Idle were designed with more traditional applications such as web browsing and FTP in mind. These traditional applications typically generated traffic only during the Active (Connected) state, and once the user session ended, the traffic usually ended too, allowing the radio to move to the Inactive (Idle) state. However, newer applications such as Facebook and Twitter generate a constant stream of autonomous and/or user-generated traffic at all times, erasing the previously clear demarcation between Active and Inactive states. This means a given mobile device (or user equipment, in LTE parlance) often ends up moving between Connected and Idle states frequently to send mostly short bursts of data, draining the device battery and causing excessive signaling overhead in LTE networks. This growing problem has attracted the research community's attention to the negative effects of frequent back-and-forth transitions between LTE radio states. In this article, we first explore the traffic characteristics of these emerging mobile Internet applications and how they differ from more traditional applications. We investigate their impact on LTE device power and air-interface signaling. We then present a survey of state-of-the-art solutions proposed in the literature to address these problems, and analyze their merits and demerits. Lastly, we discuss the solutions adopted by 3GPP, including the latest developments in Release 11, to handle these issues, and present potential future research directions in this field.


IEEE Transactions on Wireless Communications | 2014

Device Power Saving and Latency Optimization in LTE-A Networks Through DRX Configuration

Ali T. Koc; Satish Chandra Jha; Rath Vannithamby; Murat Torlak

Discontinuous reception (DRX) saves battery power of user equipment (UE), usually at the expense of a potential increase in latency, in Long Term Evolution (LTE) networks. Therefore, an optimization is needed to find the best tradeoff between latency and power saving. In this paper, we first develop an analytical model to estimate the power saving achieved and the latency incurred by the DRX mechanism for active and background mobile traffic. A tradeoff scheme is then formulated to maintain a balance between these two performance parameters based on the operator's preference for power saving and the latency requirement of the traffic. The analytical model is validated using system-level simulation results obtained from OPNET Modeler. The results show that the proposed tradeoff scheme is efficient in keeping a balance between power saving and latency. The results also indicate that DRX short cycles are very effective in reducing latency for active traffic, while a shorter inactivity timer is desirable for background traffic to enhance power saving. We also propose a mechanism to switch the DRX configuration based on the traffic running at the UE, using the UE assistance procedure recently adopted by 3GPP in Release 11. DRX configuration switching increases the power saving significantly without any noticeable increase in the latency of active traffic.
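A minimal numeric sketch of the power-saving/latency tension that this line of work optimizes (a toy model with hypothetical parameter values, not the paper's analytical model): with a DRX cycle of length `cycle_ms` and an on-duration `on_ms`, the fraction of time the radio sleeps is a proxy for power saving, while a packet arriving at a uniformly random instant waits on average half of the off-period.

```python
def drx_tradeoff(cycle_ms, on_ms, weight=0.5):
    """Toy DRX model: sleep fraction vs. mean wake-up delay (hypothetical)."""
    sleep_fraction = 1.0 - on_ms / cycle_ms       # power-saving proxy
    mean_delay_ms = (cycle_ms - on_ms) / 2.0      # uniform arrival in off-period
    # Weighted objective: the operator dials `weight` toward saving or latency.
    score = weight * sleep_fraction - (1.0 - weight) * (mean_delay_ms / cycle_ms)
    return sleep_fraction, mean_delay_ms, score

# Longer cycles save more power but add latency; short DRX cycles cut delay.
long_cfg = drx_tradeoff(cycle_ms=320, on_ms=10)
short_cfg = drx_tradeoff(cycle_ms=40, on_ms=10)
```

Sweeping these parameters and picking the configuration with the best weighted score mirrors, in miniature, the kind of operator-preference-driven tradeoff the abstract describes.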


Vehicular Technology Conference | 2012

Optimization of Discontinuous Reception (DRX) for Mobile Internet Applications over LTE

Satish Chandra Jha; Ali T. Koc; Rath Vannithamby

Discontinuous reception (DRX) brings power saving at the user equipment (UE) at the cost of increased delay in the 3GPP Long Term Evolution (LTE) network. When configuring DRX parameters, a tradeoff between power saving and delay is inevitable in practice. For example, delay is a crucial factor for delay-sensitive applications such as online gaming, while power saving becomes the main concern for social networking applications. In this paper, we propose an algorithm to efficiently select DRX parameters that ensures a balanced tradeoff between these two conflicting performance parameters, depending on the application's delay requirement and the UE power constraint. The proposed scheme is capable of optimizing one of these performance parameters while satisfying a specified level of guarantee for the other. Simulation results show that the proposed algorithm is able to increase the power saving significantly by efficiently selecting the DRX parameters. For example, smart selection of the DRX cycle for a given inactivity timer increases the power saving by more than 40%, depending on the delay requirement. Simulation results further show that the proposed algorithm provides higher flexibility in DRX parameter selection by allowing one DRX parameter to be relaxed through adjustment of another without any performance degradation.


Asilomar Conference on Signals, Systems and Computers | 2012

Controlling access overload and signaling congestion in M2M networks

Umesh Phuyal; Ali T. Koc; Mo-Han Fong; Rath Vannithamby

Access overload and signaling congestion are serious issues in machine-to-machine (M2M) networks. These issues can be caused by (i) external events triggering a large number of M2M devices to connect or disconnect all at once, (ii) recurring application events that are synchronized to the exact time (e.g., on the hour), and (iii) malfunctioning of an M2M application or server. It is vital to control access overload and signaling congestion in order to prevent a complete network collapse. This paper investigates several mechanisms to prevent these issues.


Global Communications Conference | 2014

Dual Connectivity in LTE small cell networks

Satish Chandra Jha; Kathiravetpillai Sivanesan; Rath Vannithamby; Ali T. Koc

Dual Connectivity in an LTE network can significantly improve per-user throughput and mobility robustness by allowing users to be connected simultaneously to a master cell group (MCG) and a secondary cell group (SCG) via the MeNB (master eNB) and SeNB (secondary eNB), respectively. The increase in per-user throughput is achieved by aggregating radio resources from at least two eNBs. Moreover, dual connectivity also helps in load balancing between the MCG and SCG. However, it imposes several technical challenges. The main ones are buffer status report calculation and reporting, power headroom calculation and reporting, logical channel prioritization, user power-saving operations such as discontinuous reception (DRX), and increased device complexity to support bearer split. Coordination between eNBs over the X2 interface may be effective in resolving some of these issues. The higher delay due to the non-ideal backhaul between the MeNB and SeNB, however, limits efficient coordination between these eNBs. In this paper, we explain and explore these technical challenges and investigate potential solution directions. We also provide a quantitative analysis of the potential gains in per-user throughput and load balancing that can be achieved by data bearer split in the uplink, at the cost of more complex UE behavior, using a system-level simulation study.


Wireless Communications and Networking Conference | 2013

Optimizing DRX configuration to improve battery power saving and latency of active mobile applications over LTE-A network

Ali T. Koc; Satish Chandra Jha; Rath Vannithamby; Murat Torlak

In LTE networks, mobile applications and their higher data-rate requirements are the latest phenomenon creating a tremendous need for power saving on mobile devices. Discontinuous reception (DRX) is one of the key power-saving mechanisms in LTE. Since DRX saves battery power of the user equipment usually at the expense of a potential increase in latency, an optimization is needed to find the best tradeoff between latency and power saving. In this paper, we first develop an analytical model to estimate the power saving achieved and the latency incurred by DRX operation. Then, a tradeoff scheme is formulated to maintain a balance between these two performance parameters based on the operator's preference for power saving. We validate the analytical model using system-level simulation results obtained from OPNET Modeler. The results show that the proposed tradeoff scheme is highly efficient in keeping a balance between power saving and latency. We have shown that the proposed scheme can achieve a significant delay improvement with a small decrease in power saving. The results also indicate that short DRX cycles are very effective in reducing latency for active traffic.


2013 First International Black Sea Conference on Communications and Networking (BlackSeaCom) | 2013

Power Saving mechanisms for M2M communication over LTE networks

Satish Chandra Jha; Ali T. Koc; Maruti Gupta; Rath Vannithamby

Machine-to-machine (M2M) communication is expected to be a major driver of growth in mobile communications for cellular networks such as LTE. Since LTE networks are primarily designed and optimized for human-to-human (H2H) communications, existing protocols and mechanisms are not very efficient for supporting M2M communication. Several modifications are needed to address the service requirements and traffic characteristics of M2M devices in these networks. Device power efficiency is one of the crucial requirements for M2M communication. We focus on solutions to decrease the power consumption of M2M devices over 4G networks. M2M traffic characteristics differ from those of H2H traffic in the size and frequency of the generated data. One possible mechanism is to maximize the time that the M2M device spends in a low-power state. In this paper, we evaluate the impact of extending the Paging Cycle and reducing the RRC Connected-to-Idle transition tail time on power savings. Our results show that for infrequent data transmission, extending the Paging Cycle reduces power consumption by up to 79.3%. However, for frequent data transmission, reducing the Connected-to-Idle transition tail time is more effective and reduces power consumption by up to 76.2%.
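The intuition behind extending the Paging Cycle can be sketched with a toy duty-cycle model (all power levels and cycle lengths below are made-up illustrative numbers, not the paper's measurements): idle-mode average power is dominated by the brief wake-up the device performs once per paging cycle, so stretching the cycle shrinks the wake-up duty cycle.

```python
def idle_power_mw(paging_cycle_ms, wake_ms=5.0, p_wake_mw=100.0, p_sleep_mw=1.0):
    """Toy idle-mode average power: short wake-up each paging cycle (hypothetical numbers)."""
    duty = wake_ms / paging_cycle_ms
    return duty * p_wake_mw + (1.0 - duty) * p_sleep_mw

baseline = idle_power_mw(paging_cycle_ms=1280)    # shorter paging cycle
extended = idle_power_mw(paging_cycle_ms=10240)   # extended paging cycle
saving = 1.0 - extended / baseline                # fractional power reduction
```

The model captures only the qualitative effect: savings from cycle extension are largest when the device rarely has data to send, which matches the infrequent-transmission regime the abstract highlights.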


International Conference on Communications | 2013

Adaptive DRX configuration to optimize device power saving and latency of mobile applications over LTE advanced network

Satish Chandra Jha; Ali T. Koc; Rath Vannithamby; Murat Torlak

Discontinuous reception (DRX) can be configured as a power-saving mechanism for user equipment (UE) in connected mode in the Long Term Evolution Advanced (LTE-A) network. Since DRX saves the UE's battery power usually at the cost of increased latency, a tradeoff between these two conflicting performance metrics is inevitable. DRX parameters can be optimized for either maximizing power saving or minimizing latency based on the applications running at the UE. In this paper, we first develop an analytical model to estimate the power saving achieved and the latency incurred by DRX operation. We then propose a scheme to tune DRX parameters in order to keep a balance between these conflicting performance metrics by formulating a multi-objective optimization problem. The tradeoff between power saving and latency is achieved based on the operator's relative preferences for these metrics. We also validate our analytical model with system-level simulation results from OPNET Modeler. The OPNET simulation results are well aligned with the MATLAB results based on the analytical model, and they show that the proposed scheme efficiently optimizes the DRX configuration for the battery power saving and latency requirements of diverse data applications, based on the preferences determined by the operator.


International Conference on Communications | 2014

Device power saving mechanisms for low cost MTC over LTE networks

Satish Chandra Jha; Ali T. Koc; Rath Vannithamby

In Release 12, 3GPP started to investigate the feasibility of a low-cost machine-type communication (LC-MTC) terminal class in LTE networks, aiming to be competitive with GSM/GPRS terminals targeting the same low-end MTC market, with improved coverage. LC-MTC devices are expected to have a limited power source. Since LTE technology is relatively more power hungry than GSM/GPRS, device power efficiency is one of the crucial requirements for LC-MTC over LTE networks. In this paper, we focus on exploring mechanisms to reduce power consumption for LC-MTC in LTE networks. Most current LTE power-saving mechanisms are designed and optimized for human-to-human (H2H) communications, which makes them unattractive for LC-MTC, as the traffic characteristics and service requirements for LC-MTC are expected to be largely different from those of H2H communications. LC-MTC usually generates very infrequent and small traffic bursts, mostly in the UL. Therefore, one possible way to maximize the power saving is to maximize the time the LTE radio is turned off between traffic bursts. In this paper, we analyze the impact of turning off the LTE radio without moving the LC-MTC device into Radio Resource Control (RRC) Idle, and of reducing the RRC Connected tail time (i.e., the RRC Inactivity Timer). Our results show that turning the LTE radio off after data transmission can achieve up to a 95.28% reduction in power consumption compared to moving the device to RRC Idle. Reducing the RRC Connected tail time further improves the device power saving.
