Human Generated Data

Title

Untitled (children's party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17848

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.7
Person 99.7
Human 99.7
Person 98.3
Person 97.3
Person 97
Nature 93.8
Outdoors 93.3
Person 86.7
Person 84.6
Clothing 82.6
Apparel 82.6
Chair 81.3
Meal 79.7
Food 79.7
Yard 79.7
Female 76.7
Plant 76.7
Grass 74
Person 71.9
Dining Table 71
Table 71
People 70.8
Suit 70.7
Coat 70.7
Overcoat 70.7
Person 69.7
Person 69.1
Kid 67.5
Child 67.5
Animal 65.8
Person 64.5
Girl 64.4
Sand 64.2
Crowd 62.5
Shorts 62.3
Helmet 62.2
Play 58
Eating 57.4
Woman 57.2
Texture 56.6
Countryside 55.9

Clarifai
created on 2023-10-29

people 100
group together 99.4
many 98.3
adult 97.8
group 97.3
seat 96.1
man 95.6
recreation 92.9
furniture 92.9
bench 91.5
wear 91.3
child 91.2
cavalry 90.7
baseball 90.7
chair 90.4
outfit 90.2
nostalgia 90.2
administration 89.8
monochrome 87.3
home 86.7

Imagga
created on 2022-02-26

wheelchair 53.8
chair 38.2
carriage 31.3
seat 26.6
wheeled vehicle 24.5
vehicle 24.2
man 23.5
cart 20.7
outdoors 18.8
people 18.4
wheel 16.2
adult 16.2
old 16
transportation 15.2
male 15
jinrikisha 14.9
furniture 14.5
disabled 13.8
conveyance 13.1
tricycle 12.5
care 12.3
person 12.2
wagon 12
horse 11.9
city 11.6
street 11
car 10.7
outdoor 10.7
park 10.7
travel 10.6
couple 10.5
portrait 10.4
men 10.3
help 10.2
lifestyle 10.1
transport 10
road 9.9
sick 9.7
building 9.6
illness 9.5
sport 9.5
senior 9.4
attractive 9.1
barrow 8.9
urban 8.7
day 8.6
elderly 8.6
sitting 8.6
health 8.3
looking 8
women 7.9
disability 7.9
handicap 7.9
ride 7.8
outside 7.7
relax 7.6
furnishing 7.5
bench 7.4
vacation 7.4
speed 7.3
handcart 7.3
smile 7.1
love 7.1
medical 7.1
support 7.1
sky 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 98.9
text 96.1
black and white 79.9
person 64.3
team 25.9

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Happy 72.2%
Calm 11%
Surprised 9.2%
Fear 3.8%
Sad 1.4%
Angry 1.2%
Disgusted 0.7%
Confused 0.5%

AWS Rekognition

Age 22-30
Gender Male, 74.4%
Calm 90.8%
Sad 5.4%
Happy 1.5%
Confused 0.7%
Angry 0.6%
Surprised 0.5%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 18-24
Gender Male, 88.8%
Calm 84.5%
Sad 5.9%
Happy 3.1%
Angry 2%
Surprised 1.9%
Disgusted 1.6%
Fear 0.6%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Person 98.3%
Person 97.3%
Person 97%
Person 86.7%
Person 84.6%
Person 71.9%
Person 69.7%
Person 69.1%
Person 64.5%
Chair 81.3%
Helmet 62.2%