Human Generated Data

Title

Untitled (people eating at party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17206

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 99.3
Person 99.2
Person 98.9
Person 98
Person 97.9
Person 97.4
Person 96
Person 95.2
Person 94.4
Crowd 80.2
Meal 79.4
Food 79.4
Clothing 77.2
Apparel 77.2
People 70.9
Plant 70
Sitting 69.6
Sunglasses 69.1
Accessories 69.1
Accessory 69.1
Indoors 67.6
Person 65.2
Flower 63.2
Blossom 63.2
Photography 60.2
Photo 60.2
Suit 59.2
Coat 59.2
Overcoat 59.2
Dish 58.1
Room 57.4
Cafeteria 57.2
Restaurant 57.2
Flower Bouquet 55.3
Flower Arrangement 55.3
Face 55.2
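Each line in the Amazon tag list above pairs a label with a confidence score (a percentage). These pairs have the shape of an AWS Rekognition detect_labels response, whose Labels array carries Name and Confidence fields. A minimal sketch of turning such a response into lines like those listed, using a hypothetical sample response rather than a live API call:

```python
# Sketch: formatting a Rekognition-style detect_labels response into
# "Label Confidence" lines like those listed above. The sample response
# below is hypothetical; a real one would come from
# boto3.client("rekognition").detect_labels(...).
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Crowd", "Confidence": 80.2},
        {"Name": "Meal", "Confidence": 79.4},
    ]
}

def format_labels(response):
    """Return one 'Name Confidence' line per label, rounded to one decimal."""
    lines = []
    for label in response["Labels"]:
        conf = round(label["Confidence"], 1)
        # %g drops a trailing .0, so 98.0 prints as "98", matching the list above.
        lines.append(f"{label['Name']} {conf:g}")
    return lines

print("\n".join(format_labels(sample_response)))
```

The other services' tag lists (Clarifai, Imagga, Google, Microsoft) follow the same label-plus-confidence pattern and could be rendered the same way.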

Clarifai
created on 2023-10-29

people 99.9
many 99
group 98.6
group together 97.9
child 96.9
woman 96.5
adult 96.5
man 95
monochrome 92.3
recreation 91.5
education 90.9
administration 90.4
boy 89.1
leader 89
furniture 88.8
music 87.2
audience 84.2
wear 83.5
crowd 82.3
elderly 82.1

Imagga
created on 2022-02-26

person 33.2
people 32.3
man 29.5
groom 22.8
male 21.3
group 19.3
team 18.8
men 17.2
meeting 16.9
adult 16.7
together 16.6
couple 16.5
business 16.4
businessman 15
patient 14.2
teamwork 13.9
happy 13.1
senior 13.1
room 12.7
teacher 12.5
nurse 12.1
table 12.1
sitting 12
work 11.9
happiness 11.7
brass 11.1
fan 11
case 11
suit 10.8
indoors 10.5
husband 10.5
office 10.4
desk 10.4
education 10.4
women 10.3
mature 10.2
two 10.2
old 9.7
new 9.7
colleagues 9.7
photographer 9.7
celebration 9.6
businesspeople 9.5
wife 9.5
smiling 9.4
sick person 9.3
spectator 9.1
cheerful 8.9
home 8.8
student 8.8
talking 8.5
human 8.2
indoor 8.2
businesswoman 8.2
follower 8.1
religion 8.1
worker 8
job 8
working 7.9
lifestyle 7.9
bride 7.8
party 7.7
chair 7.7
married 7.7
wind instrument 7.6
hand 7.6
world 7.5
laptop 7.4
musical instrument 7.4
life 7.4
holiday 7.2
portrait 7.1
family 7.1
love 7.1
medical 7.1
day 7.1
classroom 7
professional 7

Google
created on 2022-02-26

Black 89.6
Black-and-white 86.2
Style 84
Adaptation 79.2
Monochrome photography 75.9
Monochrome 75.4
Event 72.8
Room 68.3
Vintage clothing 68.3
Chair 66
Crew 64.3
Crowd 63.1
Hat 62.4
Stock photography 61.9
History 61.9
Sitting 61.7
Font 59.4
Team 59.2
Suit 56.4
Fun 55.1

Microsoft
created on 2022-02-26

person 99.8
text 91.9
black and white 80.7
people 59.2
man 53.8
clothing 51.6
crowd 22.3

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Calm 99.8%
Happy 0.1%
Confused 0%
Sad 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Female, 85%
Sad 82.1%
Calm 6.9%
Happy 4.1%
Surprised 2.5%
Disgusted 1.8%
Fear 1.1%
Confused 0.9%
Angry 0.6%

AWS Rekognition

Age 35-43
Gender Female, 81.5%
Calm 99.9%
Surprised 0%
Sad 0%
Angry 0%
Happy 0%
Fear 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 45-53
Gender Female, 89.8%
Calm 97.1%
Sad 1.8%
Happy 0.6%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Female, 66.8%
Calm 82.4%
Sad 10.5%
Fear 1.9%
Happy 1.6%
Disgusted 1.3%
Confused 1.1%
Surprised 0.6%
Angry 0.5%

AWS Rekognition

Age 50-58
Gender Male, 94.1%
Sad 41.8%
Confused 32.1%
Calm 23.2%
Happy 1.1%
Disgusted 0.6%
Surprised 0.5%
Angry 0.4%
Fear 0.3%
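Each AWS Rekognition face block above lists an estimated age range, a gender call with its confidence, and eight emotion scores in descending order. That matches the FaceDetails structure returned by Rekognition's detect_faces when facial attributes are requested. A sketch of ordering one face's Emotions list as shown, on a hypothetical sample detail:

```python
# Sketch: ordering a Rekognition-style FaceDetail's emotion scores,
# highest first, as in the face blocks above. The sample detail is
# hypothetical; real ones come from detect_faces(..., Attributes=["ALL"]).
sample_face = {
    "AgeRange": {"Low": 36, "High": 44},
    "Gender": {"Value": "Male", "Confidence": 99.6},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 0.1},
        {"Type": "CALM", "Confidence": 99.8},
        {"Type": "SAD", "Confidence": 0.0},
    ],
}

def emotion_lines(face):
    """Return 'Emotion confidence%' lines sorted by descending confidence."""
    ranked = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    return [f"{e['Type'].capitalize()} {e['Confidence']:g}%" for e in ranked]

print(f"Age {sample_face['AgeRange']['Low']}-{sample_face['AgeRange']['High']}")
print(f"Gender {sample_face['Gender']['Value']}, {sample_face['Gender']['Confidence']:g}%")
for line in emotion_lines(sample_face):
    print(line)
```

The emotion confidences describe how confident the model is in each expression label, not the intensity of the expression itself.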

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
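Unlike Rekognition, the Google Vision face blocks above report each attribute as a likelihood bucket rather than a percentage. The Vision API expresses these as a Likelihood enum ranging from UNKNOWN and VERY_UNLIKELY up to VERY_LIKELY. A sketch of mapping enum names to the display strings used above, with a hypothetical sample face annotation:

```python
# Sketch: mapping Google Vision Likelihood enum names to display strings
# like "Very unlikely" / "Unlikely", as in the blocks above. The enum
# names follow the Vision API's Likelihood type; the sample face dict
# is hypothetical (a real one comes from face_detection()).
LIKELIHOOD_DISPLAY = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

sample_face = {
    "surprise_likelihood": "VERY_UNLIKELY",
    "joy_likelihood": "UNLIKELY",
    "headwear_likelihood": "VERY_UNLIKELY",
}

def display(face, attribute):
    """Render one attribute line, e.g. 'Joy Unlikely'."""
    name = attribute.replace("_likelihood", "").capitalize()
    return f"{name} {LIKELIHOOD_DISPLAY[face[attribute]]}"

for attr in sample_face:
    print(display(sample_face, attr))
```

Bucketed likelihoods trade precision for stability: small score fluctuations between model versions rarely move a face across a bucket boundary.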

Feature analysis

Amazon

Person
Sunglasses
Person 99.5%
Person 99.3%
Person 99.2%
Person 98.9%
Person 98%
Person 97.9%
Person 97.4%
Person 96%
Person 95.2%
Person 94.4%
Person 65.2%
Sunglasses 69.1%