Human Generated Data

Title

Untitled (sailors saluting to officer aboard HMS Manchester)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8625

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.7
Person 99.7
Person 99.6
Person 99.5
Person 99.4
Person 99.1
Person 93.2
Clothing 90.9
Shorts 90.9
Apparel 90.9
Military 82.8
Shoe 82.8
Footwear 82.8
Officer 74.1
Military Uniform 74.1
Shoe 63.5
Building 57.4
Sailor Suit 57.1
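
The label/confidence pairs above match the output shape of Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be reproduced, assuming Python with boto3, configured AWS credentials, and a local copy of the image; the filename and confidence threshold are illustrative assumptions, not values from this record.

```python
# Sketch only: produces label/confidence pairs in the style of the
# Amazon tag list above. Filename and threshold are assumptions.
import boto3

def rekognition_labels(image_path, min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each label carries a name and a 0-100 confidence score,
    # e.g. ("Person", 99.7) or ("Sailor Suit", 57.1).
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, score in rekognition_labels("18082A.jpg"):  # placeholder filename
        print(name, score)
```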

Imagga
created on 2022-01-09

ship 27.4
industrial 25.4
stretcher 25.3
industry 23.9
conveyance 23.8
litter 20.2
power 20.1
factory 19.3
vessel 17.1
steel 16.8
sky 15.9
metal 15.3
warship 14.4
ocean 14.1
military uniform 13.9
construction 13.7
uniform 13.5
tank 13.3
boat 12.9
transport 12.8
building 12.7
energy 12.6
transportation 12.5
port 12.5
sea 12.5
chairlift 12.3
business 12.1
travel 12
technology 11.9
machine 11.8
work 11.8
stage 11.7
pipe 11.7
military vehicle 11.3
deck 11.2
equipment 11.1
production 10.7
water 10.7
vacation 10.6
battleship 10.6
military 10.6
vehicle 10.2
liner 10.1
war 9.9
ski tow 9.8
station 9.8
cruise 9.7
plant 9.7
platform 9.7
urban 9.6
oil 9.3
old 9.1
environment 9
craft 8.7
tube 8.7
gas 8.7
harbor 8.7
heavy 8.6
device 8.6
control 8.6
engineering 8.6
clothing 8.5
city 8.3
air 8.3
tourism 8.2
protection 8.2
man 8.1
refinery 7.9
pilot 7.8
engine 7.7
pollution 7.7
storage 7.6
passenger 7.5
structure 7.4
street 7.4
jet 7.3
aircraft carrier 7.3
tourist 7.2
tower 7.2
history 7.2
summer 7.1
architecture 7
modern 7
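
Tag/score pairs like the Imagga list above can be requested from Imagga's v2 tagging endpoint. The sketch below is an illustration only: the API credentials and filename are placeholders, and the response handling follows Imagga's publicly documented v2 format rather than anything recorded here.

```python
# Sketch only: fetches tag/score pairs in the style of the Imagga list above.
# Credentials and filename are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

def imagga_tags(image_path):
    with open(image_path, "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )
    response.raise_for_status()
    # Each entry pairs an English tag with a confidence score,
    # e.g. ("ship", 27.4) as in the list above.
    return [(t["tag"]["en"], round(t["confidence"], 1))
            for t in response.json()["result"]["tags"]]
```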

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.4
ship 95.8
clothing 92.5
black and white 89.2
person 86.9
standing 80.8
man 76.3

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 98.7%
Sad 69.6%
Calm 13.5%
Happy 11.6%
Disgusted 1.7%
Angry 1.4%
Confused 1%
Surprised 0.6%
Fear 0.5%
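
The age range, gender, and ranked emotion scores above follow the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, again assuming boto3, AWS credentials, and a local image file:

```python
# Sketch only: extracts age range, gender, and emotion scores in the
# style of the AWS Rekognition face analysis above.
import boto3

def rekognition_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    results = []
    for face in response["FaceDetails"]:
        results.append({
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            # Each emotion carries its own confidence,
            # e.g. Sad 69.6, Calm 13.5, Happy 11.6 as above.
            "emotions": sorted(
                ((e["Type"], e["Confidence"]) for e in face["Emotions"]),
                key=lambda pair: pair[1],
                reverse=True,
            ),
        })
    return results
```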

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
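
Each Google Vision block above corresponds to one detected face, reported as likelihood ratings rather than numeric scores. A minimal sketch of how such per-face ratings could be retrieved, assuming the google-cloud-vision Python client, application default credentials, and a local image file:

```python
# Sketch only: reads per-face likelihood ratings in the style of the
# Google Vision blocks above.
from google.cloud import vision

def vision_faces(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Each attribute is a likelihood enum such as VERY_UNLIKELY or LIKELY.
        results.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return results
```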

Feature analysis

Amazon

Person 99.7%
Shoe 82.8%

Captions

Microsoft

a group of people standing in front of a building 76.9%
a group of people standing in front of a store 67.5%
a group of people standing next to a building 67.4%
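
Ranked captions like these match the output of Azure Computer Vision's describe operation. The sketch below assumes the azure-cognitiveservices-vision-computervision Python SDK with a placeholder endpoint and key, and that the percentages above are the SDK's 0-1 confidences scaled by 100.

```python
# Sketch only: requests ranked captions in the style of the Microsoft
# list above. Endpoint, key, and filename are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"                                                   # placeholder

def azure_captions(image_path, max_candidates=3):
    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    with open(image_path, "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=max_candidates)
    # Each candidate caption carries a 0-1 confidence; scale by 100
    # to compare with the percentages shown above.
    return [(c.text, round(c.confidence * 100, 1)) for c in description.captions]
```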

Text analysis

Amazon

A
18082 A
18082
18082A.

Google

18082 A • 18082A.
18082A.
18082
A
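
Both text lists report the same inscription detected on the print. A minimal sketch of how such strings could be extracted, assuming boto3 for Amazon Rekognition and the google-cloud-vision client for Google, with a local image file as a placeholder input:

```python
# Sketch only: extracts detected text in the style of the two lists above.
import boto3
from google.cloud import vision

def rekognition_text(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # LINE detections correspond to multi-word strings such as "18082 A";
    # WORD detections are the individual tokens.
    return [(d["DetectedText"], d["Type"]) for d in response["TextDetections"]]

def vision_text(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        response = client.text_detection(image=vision.Image(content=f.read()))
    # The first annotation is the full detected block; the rest are tokens.
    return [t.description for t in response.text_annotations]
```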