Human Generated Data

Title

Untitled (two photographs: three military officers standing outside; four military cadets in athletic clothes)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13834

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.9
Human 99.9
Person 99.8
Person 99.8
Person 99.8
Person 99.7
Person 98.8
Person 97.2
Collage 96.4
Advertisement 96.4
Poster 96.4
Person 94.2
Person 92
Apparel 73.1
Clothing 73.1
Overcoat 69.7
Coat 69.7
Suit 69.7
Military Uniform 67.6
Military 67.6
Person 67.4
People 66
Mammal 65.4
Horse 65.4
Animal 65.4
Pedestrian 58.8
Tarmac 56.6
Asphalt 56.6

Clarifai
created on 2019-11-16

people 99.6
monochrome 99.4
street 97.9
man 96.6
group together 96.4
adult 95.6
group 95.5
many 93.1
woman 92.2
wear 89.4
child 88.5
military 87.8
black and white 84.5
soldier 83.6
administration 82.9
war 82.7
uniform 80.2
home 80.1
one 77.7
vehicle 77.2

Imagga
created on 2019-11-16

architecture 27.6
city 25.8
building 24.2
window 21.1
urban 19.2
people 17.8
old 17.4
house 16.9
travel 16.2
street 13.8
shop 13.6
wall 12.1
light 12
glass 11.7
business 10.9
silhouette 10.8
history 10.7
sidewalk 10.7
crowd 10.6
scene 10.4
home 10.4
square 9.9
historical 9.4
stone 9.3
tourism 9.1
transportation 9
life 8.9
night 8.9
man 8.8
center 8.8
lamp 8.8
station 8.8
structure 8.4
town 8.3
historic 8.2
road 8.1
religion 8.1
black 8
balcony 7.9
women 7.9
roof 7.9
hall 7.6
fashion 7.5
world 7.4
place 7.4
exterior 7.4
shopping 7.3
graffito 7.3
transport 7.3
mercantile establishment 7.1
adult 7.1
summer 7.1
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

street 97.4
text 94.7
black and white 92.4
person 85.2
clothing 82.3
group 68.6
footwear 66.7
monochrome 65.9
people 61.2
man 51.5

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 54.7%
Fear 45.3%
Happy 45.1%
Angry 45.2%
Disgusted 45.6%
Sad 45%
Surprised 51.7%
Calm 46.8%
Confused 45.2%

AWS Rekognition

Age 30-46
Gender Male, 54.8%
Sad 46%
Happy 45.2%
Fear 45.3%
Calm 45.6%
Angry 45.9%
Disgusted 51.6%
Confused 45.3%
Surprised 45.2%

AWS Rekognition

Age 18-30
Gender Male, 50.3%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Sad 49.5%
Fear 49.5%
Calm 50.4%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 23-37
Gender Female, 52.3%
Angry 45.6%
Sad 48.5%
Confused 45.1%
Calm 50.2%
Surprised 45.1%
Happy 45.2%
Disgusted 45.1%
Fear 45.2%

AWS Rekognition

Age 6-16
Gender Male, 50.3%
Calm 50.3%
Fear 49.5%
Sad 49.6%
Happy 49.5%
Disgusted 49.5%
Surprised 49.5%
Confused 49.5%
Angry 49.5%

AWS Rekognition

Age 23-37
Gender Male, 50.3%
Sad 50.2%
Surprised 49.5%
Confused 49.5%
Angry 49.5%
Calm 49.5%
Fear 49.7%
Happy 49.5%
Disgusted 49.5%

AWS Rekognition

Age 26-40
Gender Male, 53.8%
Angry 45.3%
Disgusted 45.3%
Happy 52.3%
Calm 45.7%
Sad 45.3%
Surprised 45.3%
Fear 45.7%
Confused 45.1%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Disgusted 49.5%
Fear 49.7%
Confused 49.5%
Angry 49.6%
Calm 50%
Surprised 49.6%
Sad 49.6%
Happy 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 50.5%
Angry 49.5%
Sad 49.5%
Happy 49.5%
Fear 49.5%

Feature analysis

Amazon

Person 99.9%
Horse 65.4%
