Human Generated Data

Title

Untitled (king and queen on podium with group in formal attire)

Date

c. 1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5722

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.4
Person 99.4
Person 99.2
Person 98.8
Person 98.5
Person 98.1
Person 97.7
Person 97.6
Person 97.4
Person 97.4
Person 95.1
Person 92.6
Person 84.3
Indoors 82.1
Room 80.1
Person 74.1
Clothing 72.4
Apparel 72.4
Furniture 71.3
Pub 70.3
People 70
Drink 67.9
Beverage 67.9
Chair 66.1
Crowd 64.9
Person 63.3
Person 62
Bar Counter 59.3
Suit 56.5
Overcoat 56.5
Coat 56.5

Clarifai
created on 2019-06-01

people 99.7
group 98.9
man 96.5
many 95.9
adult 95.9
furniture 95.2
group together 94.2
woman 93.4
room 92.8
chair 91.2
indoors 87.7
exhibition 85.9
crowd 85.1
leader 84
administration 83.7
monochrome 83.2
child 82.2
several 81.9
wear 81.3
illustration 76.9

Imagga
created on 2019-06-01

architecture 26.5
city 23.3
history 19.7
building 15.9
people 15
monument 14.9
travel 14.8
historic 14.7
house 14.2
old 13.9
tourism 13.2
urban 13.1
famous 13
business 12.7
landmark 12.6
sculpture 12.6
art 12.5
marble 12.3
ancient 12.1
structure 12
counter 11.5
design 11.2
drawing 11.2
statue 10.8
shop 10.4
historical 10.3
church 10.2
light 10
office 10
home 9.6
facade 9.5
life 9.1
religion 9
sky 8.9
sketch 8.9
interior 8.8
window 8.7
scene 8.6
mercantile establishment 8.6
construction 8.5
column 8.2
adult 8.2
men 7.7
modern 7.7
culture 7.7
stone 7.7
god 7.6
capital 7.6
room 7.5
destination 7.5
symbol 7.4
group 7.2
tourist 7.2
national 7.2
women 7.1
table 7.1
male 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

indoor 95.6
person 89.1
window 81.3
black and white 73.9

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Confused 45.7%
Surprised 46.3%
Calm 47%
Sad 46.3%
Happy 45.9%
Disgusted 47.5%
Angry 46.3%

AWS Rekognition

Age 20-38
Gender Female, 54%
Angry 46.6%
Happy 45.3%
Confused 45.9%
Calm 49.7%
Surprised 45.9%
Sad 46.1%
Disgusted 45.5%

AWS Rekognition

Age 30-47
Gender Female, 50.3%
Angry 49.6%
Confused 49.5%
Surprised 49.5%
Happy 50.1%
Sad 49.7%
Calm 49.5%
Disgusted 49.5%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Surprised 49.6%
Confused 49.5%
Disgusted 49.5%
Happy 49.6%
Sad 49.5%
Calm 50.1%
Angry 49.6%

AWS Rekognition

Age 20-38
Gender Female, 54.8%
Disgusted 45.5%
Happy 45.5%
Sad 46.2%
Calm 51.6%
Angry 45.3%
Surprised 45.5%
Confused 45.4%

AWS Rekognition

Age 29-45
Gender Female, 50.8%
Disgusted 49%
Surprised 45.8%
Angry 45.5%
Confused 45.7%
Sad 45.3%
Calm 48.3%
Happy 45.3%

AWS Rekognition

Age 23-38
Gender Female, 53.9%
Angry 46.5%
Sad 49.2%
Happy 45.1%
Calm 47%
Confused 45.9%
Disgusted 45.5%
Surprised 45.8%

AWS Rekognition

Age 45-63
Gender Female, 54.3%
Disgusted 45.1%
Happy 52.1%
Surprised 45.3%
Sad 46.9%
Angry 45.2%
Confused 45.1%
Calm 45.3%

AWS Rekognition

Age 16-27
Gender Female, 52.1%
Sad 52%
Happy 45.4%
Surprised 45.4%
Calm 45.9%
Disgusted 45.3%
Confused 45.5%
Angry 45.5%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people standing in front of a window 48.5%
a group of people in front of a window 47.2%
a group of people standing in front of a store window 37.4%

Text analysis

Amazon

LRAE