Human Generated Data

Title

Untitled (group of people walking across airfield)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7045

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.4
Person 99.4
Person 98.9
Person 98.7
Person 98.5
Person 98.1
Airplane 98.1
Vehicle 98.1
Transportation 98.1
Aircraft 98.1
Person 98
Person 97.9
Person 97.3
Person 92.5
Airfield 89.2
Airport 89.2
Person 88.8
Shorts 87.4
Clothing 87.4
Apparel 87.4
Landing 82
Person 77
Wheel 72.7
Machine 72.7
People 64.1
Person 63
Person 62.2
Jet 59.4

Imagga
created on 2021-12-15

warship 39.8
aircraft carrier 34.4
ship 33.9
military vehicle 33.4
jet 27.7
sky 26.8
travel 26.8
chairlift 25.3
sea 23.5
snow 22.1
vehicle 21.8
water 21.4
winter 21.3
vessel 20.9
ski tow 20.4
airport 18.8
landscape 18.6
ocean 17.7
conveyance 16.6
beach 14.8
cold 14.6
mountain 14.6
airplane 14.6
slope 14.2
tourism 14
battleship 13.8
transportation 13.5
vacation 13.1
airfield 13
cloud 12.9
ski slope 12.9
river 12.5
craft 12.3
sport 12.1
air 12
ski 11.9
sand 11.3
clouds 11
bay 10.9
plane 10.6
aircraft 10.6
port 10.6
scenic 10.5
sun 10.5
summer 10.3
fly 10.3
wing 10.1
transport 10
industrial 10
city 10
coast 9.9
outdoors 9.7
facility 9.6
flying 9.5
industry 9.4
outdoor 9.2
tourist 9.1
structure 8.9
aviation 8.8
building 8.7
flight 8.6
construction 8.6
recreation 8.1
geological formation 8.1
business 7.9
holiday 7.9
alpine 7.8
black 7.8
architecture 7.8
wave 7.8
mountains 7.4
environment 7.4
speed 7.3
lake 7.3
horizon 7.2
tower 7.2
trees 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.9
outdoor 98.7
black and white 92.8
sky 72.3
person 71.2
white 65
old 61.4
people 56.1
vintage 37.7

Face analysis

Amazon

Google

AWS Rekognition

Age 33-49
Gender Female, 50.1%
Calm 89.8%
Sad 5.6%
Happy 1.3%
Confused 1.1%
Angry 1%
Surprised 0.6%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition

Age 13-25
Gender Male, 67.6%
Surprised 47.2%
Happy 34.7%
Fear 13%
Calm 1.7%
Angry 1.4%
Confused 1%
Sad 0.5%
Disgusted 0.5%

AWS Rekognition

Age 47-65
Gender Male, 74.6%
Calm 66.8%
Happy 13.5%
Surprised 11.8%
Sad 4.7%
Confused 1.7%
Fear 0.6%
Disgusted 0.5%
Angry 0.3%

AWS Rekognition

Age 31-47
Gender Male, 50.2%
Sad 48.8%
Calm 45.6%
Confused 2.7%
Angry 0.8%
Happy 0.7%
Surprised 0.6%
Fear 0.5%
Disgusted 0.3%

AWS Rekognition

Age 23-35
Gender Male, 76.8%
Happy 65.6%
Sad 9.9%
Calm 8.5%
Angry 7.2%
Confused 4.2%
Disgusted 3.8%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 50-68
Gender Female, 59.9%
Calm 76.6%
Happy 10.4%
Sad 9.5%
Angry 1.5%
Confused 1.2%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 13-23
Gender Female, 85%
Calm 63.6%
Sad 31.8%
Happy 2%
Angry 1.1%
Confused 0.7%
Surprised 0.5%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Male, 67.4%
Calm 98.7%
Sad 0.6%
Angry 0.3%
Happy 0.3%
Surprised 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 47-65
Gender Male, 56.4%
Calm 99.3%
Happy 0.5%
Angry 0.2%
Surprised 0%
Sad 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 36-54
Gender Male, 93.2%
Calm 82.6%
Sad 10.6%
Confused 3.1%
Happy 1.7%
Surprised 1%
Angry 0.6%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 25-39
Gender Female, 72.8%
Confused 57.6%
Sad 24.5%
Calm 12.4%
Happy 2.4%
Fear 1.1%
Surprised 1.1%
Angry 0.6%
Disgusted 0.4%

AWS Rekognition

Age 19-31
Gender Female, 54.6%
Calm 50.9%
Fear 24.7%
Sad 10.9%
Happy 4.7%
Surprised 4.2%
Angry 2.2%
Confused 2%
Disgusted 0.3%

AWS Rekognition

Age 48-66
Gender Female, 52.4%
Calm 88.4%
Happy 4.8%
Sad 2.2%
Confused 1.9%
Angry 1.2%
Surprised 0.8%
Disgusted 0.6%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.4%
Airplane 98.1%
Wheel 72.7%

Captions

Microsoft

a vintage photo of a group of people standing around a plane 91.4%
a group of people standing around a plane 91%
a vintage photo of a group of people standing in front of a plane 87.8%

Text analysis

Amazon

NATIONAL
NATIONAL AIRLINES
AIRLINES
38
KODVK-SEEIA
ISS

Google

38
38 YT37A2 AGON
YT37A2
AGON