What is the difference between ICS 4.0.2 and 4.0.4?
Through third-party apps, users can connect to compatible devices to take advantage of new features such as instant sharing of files, photos, or other media; streaming video or audio from another device; or connecting to compatible printers or other devices.

With support from third-party apps, users can connect to wireless medical devices and sensors in hospitals, fitness centers, homes, and elsewhere. Android 4.0 (Ice Cream Sandwich) includes all of the familiar Android 3.x features and, for developers, provides a unified UI framework for building apps that run on both phones and tablets.

A shared social provider and API provide a new unified store for contacts, profile data, stream items, and photos. Any app or social network with user permission can contribute raw contacts and make them accessible to other apps and networks.

Applications with user permission can also read profile data from the provider and display it in their applications. The social API lets applications store standard contact data as well as new types of content for any given contact, including large profile photos, stream items, and recent activity feedback. The social provider uses the recent activity feedback as a new signal in ranking, such as for name auto-complete, to keep the most relevant contacts ranked closest to the top.

Applications can also let users set up a social connection to a contact from the People app. When the user touches Add Connection in a contact, the app sends a public intent that other apps can handle, displaying any UI needed to create the social connection. Building on the social API, developers can add powerful new interactions that span multiple social networks and contact sources.

A shared calendar content provider and framework API make it easier for developers to add calendar services to their apps.

With user permission, any application can add events to the shared database and manage dates, attendees, alerts, and reminders. Applications can also read entries from the database, including events contributed by other applications, and handle the display of event alerts and reminders.

Apps can also use calendar data to improve the relevance of their other content. For lighter-weight access to calendar services, the Calendar app defines a set of public Intents for creating, viewing, and editing events. Rather than needing to implement a calendar UI and integrate directly with the calendar provider, applications can simply broadcast calendar Intents.
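As a sketch of this lighter-weight path (assuming an Activity context on API level 14+; the title and location strings are placeholders, not from the original):

```java
// Illustrative sketch only; requires the Android framework (API 14+).
import android.content.Intent;
import android.provider.CalendarContract;

public class AddEventExample extends android.app.Activity {
    void addDinnerEvent(long startMillis, long endMillis) {
        Intent intent = new Intent(Intent.ACTION_INSERT)
                .setData(CalendarContract.Events.CONTENT_URI)
                .putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, startMillis)
                .putExtra(CalendarContract.EXTRA_EVENT_END_TIME, endMillis)
                .putExtra(CalendarContract.Events.TITLE, "Dinner reservation")
                .putExtra(CalendarContract.Events.EVENT_LOCATION, "Example Cafe");
        // The Calendar app handles the intent, shows its editor UI, and stores
        // the event; no calendar-provider code is needed in this app.
        startActivity(intent);
    }
}
```

The app never touches the calendar database directly; the intent extras carry all the event data.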

When the Calendar app receives the Intents, it launches the appropriate UI and stores any event data entered. Using calendar Intents, for example, apps can let users add events directly from lists, dialogs, or home screen widgets, such as for making restaurant reservations or booking time with friends.

A shared Voicemail provider and API allow developers to build applications that contribute to a unified voicemail store.

Android Beam is an NFC-based feature that lets users instantly share information about the apps they are using, just by touching two NFC-enabled phones together.

When the devices are in range — within a few centimeters — the system sets up an NFC connection and displays a sharing UI. To share whatever they are viewing with the other device, users just touch the screen.

For developers, Android Beam is a new way of triggering almost any type of proximity-based interaction. For example, it can let users instantly exchange contacts, set up multiplayer gaming, join a chat or video call, share a photo or video, and more. The system provides the low-level NFC support and the sharing UI, while the foreground app provides lightweight data to transfer to the other device.

Developers have complete control over the data that is shared and how it is handled, so almost any interaction is possible. For larger payloads, developers can even use Android Beam to initiate a connection and transfer the data over Bluetooth, without the need for user-visible pairing.
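A minimal sketch of the foreground-app side of Beam (the URI is a placeholder; requires an NFC-capable device on API 14+):

```java
// Illustrative sketch only; requires NFC hardware and the Android framework (API 14+).
import android.app.Activity;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.os.Bundle;

public class BeamExample extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
        if (adapter != null) {
            // Lightweight payload pushed when another device comes into range;
            // the URI here is a placeholder for app-specific content.
            NdefMessage message = new NdefMessage(new NdefRecord[] {
                    NdefRecord.createUri("http://example.com/item/42")
            });
            adapter.setNdefPushMessage(message, this);
        }
    }
}
```

The system handles connection setup and the sharing UI; the app only supplies the NDEF payload while it is in the foreground.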

Even if developers do not add custom interactions based on Android Beam, they can still benefit from it being deeply integrated into Android.

The UI framework includes a new widget, ShareActionProvider, that lets developers quickly embed standard share functionality and UI in the Action Bar of their applications.
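A sketch of the pattern (the resource names R.menu.main and R.id.menu_share are assumptions for illustration; the menu item must declare android.widget.ShareActionProvider as its action provider class):

```java
// Illustrative sketch only; requires the Android framework (API 14+)
// and a menu resource whose share item sets
// android:actionProviderClass="android.widget.ShareActionProvider".
import android.app.Activity;
import android.content.Intent;
import android.view.Menu;
import android.widget.ShareActionProvider;

public class ShareExample extends Activity {
    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        ShareActionProvider provider = (ShareActionProvider)
                menu.findItem(R.id.menu_share).getActionProvider();
        // Describe what to share; the system builds the list of target apps.
        Intent shareIntent = new Intent(Intent.ACTION_SEND);
        shareIntent.setType("text/plain");
        shareIntent.putExtra(Intent.EXTRA_TEXT, "Content to share");
        provider.setShareIntent(shareIntent);
        return true;
    }
}
```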

Developers simply add ShareActionProvider to the menu and set an intent that describes the desired sharing action. The system handles the rest, building the list of applications that can handle the share intent and dispatching the intent when the user chooses from the menu.

Android 4.0 also adds a low-level streaming multimedia path. The new path is ideal for applications that need to maintain complete control over media data before passing it to the platform for presentation.

The platform de-muxes, decodes, and renders the content. The audio track is rendered to the active audio device, while the video track is rendered to either a Surface or a SurfaceTexture. When rendering to a SurfaceTexture, the application can apply subsequent graphics effects to each frame using OpenGL.

Tools support for low-level streaming multimedia will be available in an upcoming release of the Android NDK.

Developers can take advantage of a variety of new camera features in Android 4.0. ZSL exposure, continuous focus, and image zoom let apps capture better still and video images, including during video capture.

Apps can even capture full-resolution snapshots while shooting video. Apps can now set custom metering regions in a camera preview, then manage white balance and exposure dynamically for those regions. For easier focusing and image processing, a face-detection service identifies and tracks faces in a preview and returns their screen coordinates.

Android 4.0 also adds media effects for transforming images and video. A set of high-performance transformation filters lets developers apply rich effects to any image passed as an OpenGL ES 2.0 texture.

Developers can adjust color levels and brightness, change backgrounds, sharpen, crop, rotate, add lens distortion, and apply other effects. The transformations are processed by the GPU, so they are fast enough for processing image frames loaded from disk, camera, or video stream.

Using the audio remote control API, any music or media app can register to receive media button events from the remote control and then manage play state accordingly.

The application can also supply metadata to the remote control, such as album art or image, play state, track number and description, duration, genre, and more.

For high-quality compressed images, the media framework adds support for WebP content. For video, the framework now supports streaming VP8 content. Additionally, developers can now use Matroska containers for Vorbis and VP8 content.

Developers can use a framework API to discover and connect directly to nearby devices over a high-performance, secure Wi-Fi peer-to-peer (P2P) connection.

No internet connection or hotspot is needed. Wi-Fi P2P opens new opportunities for developers to add innovative features to their applications. Applications can use Wi-Fi P2P to share files, photos, or other media between devices or between a desktop computer and an Android-powered device.
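A sketch of starting peer discovery with this framework API (requires API 14+, Wi-Fi Direct hardware, and the ACCESS_WIFI_STATE and CHANGE_WIFI_STATE permissions):

```java
// Illustrative sketch only; requires the Android framework (API 14+).
import android.app.Activity;
import android.content.Context;
import android.net.wifi.p2p.WifiP2pManager;
import android.os.Bundle;

public class P2pExample extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        WifiP2pManager manager =
                (WifiP2pManager) getSystemService(Context.WIFI_P2P_SERVICE);
        WifiP2pManager.Channel channel =
                manager.initialize(this, getMainLooper(), null);
        manager.discoverPeers(channel, new WifiP2pManager.ActionListener() {
            @Override public void onSuccess() {
                // Discovery started; the peer list arrives via the
                // WIFI_P2P_PEERS_CHANGED_ACTION broadcast.
            }
            @Override public void onFailure(int reason) {
                // P2P is unsupported, the framework is busy, or an error occurred.
            }
        });
    }
}
```

Once a peer is chosen, connect() establishes the group and a normal socket can carry the data.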

Applications could also use Wi-Fi P2P to stream media content from a peer device such as a digital television or audio player, connect a group of users for gaming, print files, and more.

Developers can now build powerful medical applications that use Bluetooth to communicate with wireless devices and sensors in hospitals, fitness centers, homes, and elsewhere.

Applications can collect and manage data from HDP source devices and transmit it to backend medical applications such as records systems, data analysis services, and others. Using a framework API, applications can use Bluetooth to discover nearby devices, establish reliable or streaming data channels, and manage data transmission.
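A sketch of registering as an HDP sink with this framework API (the app name and the 0x1004 data-type code are assumed values for illustration, standing in for an IEEE 11073 device specialization; requires API 14+ and the BLUETOOTH permission):

```java
// Illustrative sketch only; requires the Android framework (API 14+)
// and a paired Bluetooth HDP source device.
import android.app.Activity;
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothHealth;
import android.bluetooth.BluetoothHealthCallback;
import android.bluetooth.BluetoothProfile;

public class HealthExample extends Activity {
    void connectHealthProfile() {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        adapter.getProfileProxy(this, new BluetoothProfile.ServiceListener() {
            @Override
            public void onServiceConnected(int profile, BluetoothProfile proxy) {
                BluetoothHealth health = (BluetoothHealth) proxy;
                // Register as a sink for one device specialization; channel
                // state changes and data channels arrive via the callback.
                health.registerSinkAppConfiguration(
                        "ExampleSink", 0x1004, new BluetoothHealthCallback() {});
            }
            @Override
            public void onServiceDisconnected(int profile) { }
        }, BluetoothProfile.HEALTH);
    }
}
```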

Applications can supply any IEEE 11073 manager to retrieve and interpret health data from Continua-certified devices such as heart-rate monitors, blood meters, thermometers, and scales.

A new layout, GridLayout, improves the performance of Android applications by supporting flatter view hierarchies that are faster to lay out and render.

Because hierarchies are flatter, developers can also manage alignments between components that are visually related to each other even when they are not logically related, for precise control over application UI.
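A minimal layout sketch of the flatter hierarchy GridLayout enables, as a two-column form with no nested containers (an illustrative resource fragment, not from the original):

```xml
<!-- Illustrative layout fragment: GridLayout places each child in the
     next free cell, so no nested LinearLayouts are needed. -->
<GridLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:columnCount="2">

    <TextView android:text="Name" />
    <EditText android:layout_gravity="fill_horizontal" />

    <TextView android:text="Email" />
    <EditText android:layout_gravity="fill_horizontal" />
</GridLayout>
```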

GridLayout is also specifically designed to be configured by drag-and-drop design tools such as Android Studio.

Android 4.0 also introduces a new TextureView object. It lets developers display and manipulate OpenGL ES rendering just as they would a normal view object in the hierarchy, including moving, transforming, and animating the view as needed.

The TextureView object makes it easy for developers to embed camera preview, decoded video, OpenGL game scenes, and more. TextureView can be viewed as a more powerful version of the existing SurfaceView object, since it offers the same benefits of access to a GL rendering surface, with the added advantage of having that surface participate fully in the normal view hierarchy.
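A sketch of the camera-preview case (requires a camera-equipped device on API 14+; error handling trimmed for brevity):

```java
// Illustrative sketch only; requires the Android framework (API 14+).
import android.app.Activity;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.os.Bundle;
import android.view.TextureView;
import java.io.IOException;

public class PreviewExample extends Activity
        implements TextureView.SurfaceTextureListener {
    private Camera camera;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TextureView view = new TextureView(this);
        view.setSurfaceTextureListener(this);
        setContentView(view);  // participates in the view hierarchy like any view
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int w, int h) {
        camera = Camera.open();
        try {
            camera.setPreviewTexture(surface);  // stream preview frames into the view
            camera.startPreview();
        } catch (IOException e) {
            camera.release();
        }
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        camera.stopPreview();
        camera.release();
        return true;
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture s, int w, int h) { }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture s) { }
}
```

Because the preview is an ordinary view, it can be animated, scaled, or composited like any other UI element.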

All Android-powered devices running Android 4.0 are required to support hardware-accelerated 2D drawing. Developers can take advantage of this to add great UI effects while maintaining optimal performance on high-resolution screens, even on phones. For example, developers can rely on accelerated scaling, rotation, and other 2D operations, as well as accelerated UI components such as TextureView and compositing modes such as filtering, blending, and opacity.

Android 4.0 adds support for stylus input, pointer buttons, and hover events. To help applications distinguish motion events from different sources, the platform adds distinct tool types for stylus, finger, mouse, and eraser. For improved input from multi-button pointing devices, the platform now provides distinct primary, secondary, and tertiary buttons, as well as back and forward buttons. Hover-enter and hover-exit events are also added, for improved navigation and accessibility. Developers can build on these new input features to add powerful interactions to their apps, such as precise drawing and gesturing, handwriting and shape recognition, improved mouse input, and others.
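A sketch of branching on the new tool types inside a custom view (requires API 14+):

```java
// Illustrative sketch only; requires the Android framework (API 14+).
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class DrawingView extends View {
    public DrawingView(Context context) { super(context); }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Distinguish the input source for the first pointer.
        switch (event.getToolType(0)) {
            case MotionEvent.TOOL_TYPE_STYLUS:
                // Precise drawing; event.getPressure(0) is also meaningful here.
                return true;
            case MotionEvent.TOOL_TYPE_ERASER:
                // Erase instead of draw.
                return true;
            case MotionEvent.TOOL_TYPE_MOUSE:
                // Check event.getButtonState() for secondary/tertiary buttons.
                return true;
            default:  // TOOL_TYPE_FINGER or TOOL_TYPE_UNKNOWN
                return super.onTouchEvent(event);
        }
    }
}
```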

Android 4.0 adds a text services API for integrating spelling checkers. The text services are external to the active IME, so developers can create and distribute dictionaries and suggestion engines that plug into the platform. When an application receives results from a text service, such as word suggestions, it can display them in a dedicated suggestion popup window directly inside the text view, rather than relying on the IME to display them.
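A sketch of requesting suggestions through this API (requires API 14+ and an enabled spell-checker service; the misspelled sample word is a placeholder):

```java
// Illustrative sketch only; requires the Android framework (API 14+).
import android.app.Activity;
import android.content.Context;
import android.view.textservice.SentenceSuggestionsInfo;
import android.view.textservice.SpellCheckerSession;
import android.view.textservice.SuggestionsInfo;
import android.view.textservice.TextInfo;
import android.view.textservice.TextServicesManager;

public class SpellingExample extends Activity
        implements SpellCheckerSession.SpellCheckerSessionListener {

    void checkSpelling() {
        TextServicesManager tsm = (TextServicesManager)
                getSystemService(Context.TEXT_SERVICES_MANAGER_SERVICE);
        // Null locale: defer to the system spell-checker language settings.
        SpellCheckerSession session =
                tsm.newSpellCheckerSession(null, null, this, true);
        if (session != null) {
            session.getSuggestions(new TextInfo("helllo"), 5);  // up to 5 suggestions
        }
    }

    @Override
    public void onGetSuggestions(SuggestionsInfo[] results) {
        // Display results in a suggestion popup near the text view.
    }

    // Required when compiling against later SDKs (this callback arrived after 4.0).
    @Override
    public void onGetSentenceSuggestions(SentenceSuggestionsInfo[] results) { }
}
```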

For accessibility services such as screen readers in particular, the platform offers new APIs to query window content, for easier navigation, better feedback, and richer user interfaces.

Android 4.0.4 had all the features of previous versions, and some new features were added in this version. Key user-facing features like stability improvements, better camera performance, and screen rotation fixes were introduced in this version.

No new developer features were added in this version. Its version name is Ice Cream Sandwich. The API level in Android 4.0.4 is 15. It was released on 29 March 2012.

Android 1.0 had some basic features including web browser support, camera support, Gmail accounts, Google Maps, and a YouTube application. Although it does not have an official version name like later versions, unofficially it is called Apple Pie. The API level in Android 1.0 is 1. It is not used anymore in mobile devices.

It was released on 23 September 2008.


