Human Generated Data

Title

Untitled (people at banquet)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20042

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 99.3
Person 99.3
Person 98.8
Person 98.6
Person 98.5
Sitting 98.5
Tie 97.5
Accessories 97.5
Accessory 97.5
Person 96.4
Person 96.1
Person 95.2
Crowd 91.1
Person 88.2
Indoors 86.6
Room 86
Tie 84.7
Furniture 81.2
Clothing 81
Apparel 81
People 77
Meeting Room 70.2
Conference Room 70.2
Table 68.3
Meal 66.4
Food 66.4
Chair 64.8
Senior Citizen 63.6
Press Conference 60.6
Suit 57.4
Coat 57.4
Overcoat 57.4
Dining Table 56.8
Audience 56.8
Leisure Activities 56.6
Portrait 55.6
Photography 55.6
Face 55.6
Photo 55.6
Person 45
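The name/confidence pairs above follow the shape of Amazon Rekognition's DetectLabels response, where each label carries a `Name` and a `Confidence` score. A minimal offline sketch of formatting such a response into the listing above — the sample response below is hand-built from a few values in this record, not a live API call:

```python
# Sample response in the Rekognition DetectLabels shape (illustrative values
# copied from the listing above, not fetched from the API).
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Tie", "Confidence": 97.5},
        {"Name": "Crowd", "Confidence": 91.1},
    ]
}

def format_labels(response):
    # One "Name Confidence" line per label, confidence rounded to one decimal,
    # matching the tag listing above.
    return [f'{label["Name"]} {round(label["Confidence"], 1)}'
            for label in response["Labels"]]

for line in format_labels(sample_response):
    print(line)
```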

Clarifai
created on 2023-10-22

people 99.7
group 98.4
woman 97.8
man 96.7
leader 96.4
indoors 96.1
group together 96.1
adult 95.6
child 91.6
room 88
chair 87.9
monochrome 87.8
celebration 87.4
ceremony 87.1
administration 86.5
many 85.7
wedding 84.3
sit 84.3
elderly 82.1
military 82

Imagga
created on 2022-03-05

senior 43.1
man 37.6
teacher 36.5
male 34
people 34
couple 33.1
person 32.6
adult 31.5
indoors 28.1
home 27.9
educator 25.2
mature 25.1
sitting 24.9
elderly 24.9
professional 24.8
entrepreneur 24.3
happy 23.8
men 23.2
executive 22.1
room 21.8
smiling 21.7
together 21
old 20.2
table 19.9
women 19
retired 18.4
office 17.1
looking 16.8
husband 16.2
metropolitan 16.2
meeting 16
smile 15.7
retirement 15.4
wife 15.2
laptop 14.7
cheerful 14.6
group 14.5
lifestyle 14.4
happiness 14.1
computer 13.6
lunch 13.4
family 13.3
portrait 12.9
grandfather 12.8
indoor 12.8
business 12.7
businessman 12.4
restaurant 12.2
occupation 11.9
love 11.8
director 11.7
married 11.5
face 11.4
work 11
70s 10.8
grandmother 10.8
handsome 10.7
care 10.7
two people 10.7
older 10.7
interior 10.6
modern 10.5
chair 10.4
desk 10.4
hall 10.1
aged 9.9
to 9.7
discussion 9.7
colleagues 9.7
medical 9.7
casual 9.3
salon 9.1
holding 9.1
health 9
worker 8.9
40s 8.8
middle aged 8.8
look 8.8
mid adult 8.7
grandma 8.6
businesspeople 8.5
togetherness 8.5
inside 8.3
student 8.2
gray 8.1
team 8.1
hospital 8
working 7.9
60s 7.8
color 7.8
affectionate 7.7
drinking 7.7
two 7.6
talking 7.6
drink 7.5
doctor 7.5
clothing 7.4
help 7.4
patient 7.4
teamwork 7.4
glasses 7.4
life 7.4
businesswoman 7.3
success 7.2
suit 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.6
indoor 92.6
group 78
people 74.5
clothing 73.8
crowd 2.4
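Note the spread in the Microsoft scores: "crowd" is reported at only 2.4. A common post-processing step for such tag lists is a confidence threshold; the sketch below (using the tags above, with an assumed cutoff of 50) shows how the low-confidence "crowd" tag would be dropped:

```python
# Microsoft tags from the listing above, as name -> confidence.
tags = {"person": 99.6, "indoor": 92.6, "group": 78.0,
        "people": 74.5, "clothing": 73.8, "crowd": 2.4}

def confident_tags(tags, min_confidence=50.0):
    # Keep only tags at or above the threshold; here "crowd" (2.4) is dropped.
    # The 50.0 cutoff is an assumption for illustration, not a documented default.
    return {name: conf for name, conf in tags.items() if conf >= min_confidence}

print(confident_tags(tags))
```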

Color Analysis

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Calm 55.8%
Sad 40.8%
Confused 2.4%
Happy 0.4%
Disgusted 0.3%
Angry 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Female, 98.8%
Calm 95.2%
Sad 3.9%
Surprised 0.4%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 50-58
Gender Male, 90.1%
Calm 99.8%
Sad 0%
Happy 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 50-58
Gender Male, 96.4%
Calm 98.6%
Happy 0.3%
Confused 0.2%
Surprised 0.2%
Sad 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 95.5%
Calm 97.5%
Angry 0.6%
Sad 0.6%
Confused 0.4%
Surprised 0.4%
Happy 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Female, 70.3%
Calm 92%
Surprised 2.6%
Sad 1.5%
Confused 1.4%
Disgusted 0.8%
Happy 0.8%
Angry 0.6%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 89.5%
Calm 96.6%
Happy 1.7%
Sad 0.7%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
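Each AWS Rekognition block above corresponds to one entry in a DetectFaces response, where a face detail carries an `AgeRange` (`Low`/`High`), a `Gender` (`Value`/`Confidence`), and a list of `Emotions` (`Type`/`Confidence`). An offline sketch of rendering one such face detail into the layout above, using sample values copied from the first block:

```python
# One face detail in the Rekognition DetectFaces shape (values copied from
# the first block above; the emotion list is truncated for brevity).
sample_face = {
    "AgeRange": {"Low": 48, "High": 54},
    "Gender": {"Value": "Male", "Confidence": 99.5},
    "Emotions": [
        {"Type": "SAD", "Confidence": 40.8},
        {"Type": "CALM", "Confidence": 55.8},
    ],
}

def summarize_face(face):
    lines = [
        f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}',
        f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]}%',
    ]
    # Emotions listed by descending confidence, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f'{emotion["Type"].capitalize()} {emotion["Confidence"]}%')
    return lines

for line in summarize_face(sample_face):
    print(line)
```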

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Chair
Person 99.4%
Person 99.3%
Person 99.3%
Person 98.8%
Person 98.6%
Person 98.5%
Person 96.4%
Person 96.1%
Person 95.2%
Person 88.2%
Person 45%
Tie 97.5%
Tie 84.7%
Chair 64.8%

Text analysis

Amazon

8
KODVK
KODVK -2VLE
-2VLE
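These strings follow the shape of Amazon Rekognition's DetectText response, which reports each detection as a `LINE` or as its constituent `WORD`s — consistent with "KODVK -2VLE" appearing whole and split above. An offline sketch of separating the two detection types, built on a sample response assembled from the values above:

```python
# Sample response in the Rekognition DetectText shape, assembled from the
# detections listed above (illustrative; not a live API call).
sample_text = {
    "TextDetections": [
        {"DetectedText": "8", "Type": "LINE"},
        {"DetectedText": "KODVK -2VLE", "Type": "LINE"},
        {"DetectedText": "KODVK", "Type": "WORD"},
        {"DetectedText": "-2VLE", "Type": "WORD"},
    ]
}

def detections_by_type(response, detection_type):
    # Return the detected strings of one type ("LINE" or "WORD").
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == detection_type]

print(detections_by_type(sample_text, "LINE"))
print(detections_by_type(sample_text, "WORD"))
```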