Human Generated Data

Title

Untitled (women and children on steps)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17049

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.4
Human 99.4
Person 95.4
Person 95.4
Person 92
Person 90.5
Clothing 90.2
Apparel 90.2
Person 89.2
Person 86
People 73.8
Crowd 67.4
Female 64.5
Girl 63.1
Person 63.1
Advertisement 62.9
Poster 58.7
Suit 57.8
Coat 57.8
Overcoat 57.8
Person 52.7
Person 47.7
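
The Amazon tags above have the shape of output from Amazon Rekognition's DetectLabels API. As a hedged illustration only (the museum's actual pipeline is not documented in this record, and the filename is a placeholder), a minimal boto3 call looks like this:

```python
# Minimal sketch of Rekognition label detection, assuming AWS credentials
# are configured and the scan exists locally as "photo.jpg" (placeholder).
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=40,  # the record includes tags scored as low as ~47
    )

# Each label carries a name and a 0-100 confidence, matching the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```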

Clarifai
created on 2023-10-29

people 99.9
group together 99.5
group 98.6
adult 98.4
man 96.7
many 95.7
child 94.9
several 94.9
furniture 94.5
monochrome 92.6
woman 92.5
administration 91.5
recreation 91.4
war 91.1
music 85
actor 83.8
military 83.2
sit 81.9
boy 81.8
outfit 79.5
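
These concept/confidence pairs are the kind returned by Clarifai's general recognition model. A hedged sketch of a query against Clarifai's public v2 REST API follows; the key, model ID, and image URL are placeholders, not details from this record:

```python
# Illustrative only: Clarifai v2 REST prediction with placeholder credentials.
import requests

API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Clarifai reports values on a 0-1 scale; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```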

Imagga
created on 2022-02-26

shop 54.6
salon 54.3
mercantile establishment 41
barbershop 40
place of business 27.4
man 21.7
people 18.4
shoe shop 17.7
men 14.6
establishment 13.7
person 13.5
male 12.8
city 12.5
room 12
home 11.2
adult 11
black 10.8
celebration 10.4
art 9.9
interior 9.7
mask 9.7
medical 9.7
medicine 9.7
urban 9.6
toyshop 9.6
party 9.5
decoration 9.4
indoor 9.1
indoors 8.8
table 8.7
women 8.7
work 8.3
life 8.3
history 8
sculpture 8
glass 7.8
portrait 7.8
hospital 7.5
traditional 7.5
tourism 7.4
business 7.3
kitchen 7.2
modern 7
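
The Imagga tags resemble responses from Imagga's v2 tagging endpoint, which returns tag/confidence pairs including WordNet-style category terms such as "mercantile establishment". A minimal sketch, with placeholder credentials and image URL:

```python
# Illustrative only: Imagga v2 tagging via its REST API with Basic auth.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder
    auth=("api_key", "api_secret"),  # placeholder Basic-auth credentials
)
resp.raise_for_status()

# Imagga confidences are already on a 0-100 scale.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```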

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.4
person 97.6
clothing 96.9
black and white 82.9
man 76.9
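
The Microsoft tags match the output of Azure's Computer Vision image-tagging service. A minimal sketch assuming the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and image URL:

```python
# Illustrative only: Azure Computer Vision tagging with placeholder credentials.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),           # placeholder key
)

# tag_image returns ImageTag objects with 0-1 confidences; scale to 0-100.
result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```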

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 59.1%
Calm 98.9%
Sad 0.5%
Angry 0.2%
Surprised 0.2%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 45-53
Gender Male, 86.5%
Calm 60%
Sad 38.8%
Confused 0.3%
Happy 0.3%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 25-35
Gender Female, 99.4%
Surprised 82%
Calm 8%
Happy 5%
Sad 1.9%
Fear 1.3%
Angry 0.7%
Confused 0.7%
Disgusted 0.5%

AWS Rekognition

Age 34-42
Gender Male, 97.9%
Calm 99.5%
Surprised 0.3%
Fear 0%
Angry 0%
Disgusted 0%
Sad 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 26-36
Gender Male, 63.4%
Calm 98.7%
Sad 1%
Happy 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 24-34
Gender Female, 83.2%
Calm 65.4%
Sad 30.8%
Happy 1.5%
Confused 1%
Disgusted 0.4%
Angry 0.4%
Fear 0.3%
Surprised 0.2%
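
Each block above has the shape of one FaceDetail returned by Amazon Rekognition's DetectFaces API, which reports an estimated age range, a gender prediction with confidence, and per-emotion confidences. A minimal boto3 sketch (placeholder filename; not the museum's documented method):

```python
# Illustrative only: Rekognition face analysis with all facial attributes.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; order by confidence to match the record.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```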

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
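
The four repeated blocks correspond to per-face annotations from Google Cloud Vision, which reports enum likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages; hence every field here reads "Very unlikely". A minimal sketch assuming the google-cloud-vision client and configured application credentials:

```python
# Illustrative only: Google Cloud Vision face detection likelihoods.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
# One annotation per detected face, each with enum likelihood fields.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```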

Feature analysis

Amazon

Person
Person 99.4%
Person 95.4%
Person 95.4%
Person 92%
Person 90.5%
Person 89.2%
Person 86%
Person 63.1%
Person 52.7%
Person 47.7%
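
These per-person percentages plausibly come from the Instances list in the same DetectLabels response that produced the tags above: Rekognition returns one bounding box and confidence per detected person. A hedged sketch (placeholder filename):

```python
# Illustrative only: extracting per-instance Person detections from
# a Rekognition DetectLabels response.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            print(f'Person {instance["Confidence"]:.1f}%')
```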
