Human Generated Data

Title

Untitled (two people serving themselves food at buffet table in dining room with man in white suit reaching across table in background)

Date

1940-1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9066

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.4
Person 99.4
Person 98.2
Person 97.2
Food 96.5
Meal 96.5
Dish 88.2
Apparel 78.1
Clothing 78.1
Plant 78
Person 60.9
Cafeteria 60.3
Restaurant 60.3
Shorts 55.6

Imagga
created on 2022-01-23

man 39.7
person 39.3
patient 38.8
people 31.8
male 29.8
nurse 24.3
adult 24.3
home 21.5
salon 21.2
smiling 21
medical 20.3
indoors 20.2
couple 20
doctor 19.7
senior 19.7
sick person 19.2
happy 18.8
case 18.8
sitting 18
hospital 17.2
hairdresser 15.9
coat 15.8
together 15.8
professional 15.4
lab coat 15.3
health 15.3
cheerful 14.6
table 14
room 13.8
two people 13.6
clinic 13.2
holding 13.2
mature 13
lifestyle 13
men 12.9
office 12.9
team 12.6
30s 12.5
work 11.9
women 11.9
worker 11.9
two 11.9
family 11.6
working 11.5
talking 11.4
meeting 11.3
clothing 11.3
love 11.1
smile 10.7
husband 10.6
medicine 10.6
wife 10.4
day 10.2
happiness 10.2
food 10
care 9.9
kitchen 9.8
businessman 9.7
portrait 9.7
illness 9.5
dinner 9.5
surgeon 9.3
hand 9.1
to 8.9
50s 8.8
40s 8.8
child 8.8
retired 8.7
retirement 8.6
adults 8.5
business 8.5
specialist 8.4
mother 8.4
teamwork 8.3
color 8.3
meal 8.3
occupation 8.3
laptop 8.2
computer 8
job 8
casual clothing 7.8
surgery 7.8
having 7.8
attractive 7.7
domestic 7.6
casual 7.6
grandma 7.6
togetherness 7.6
horizontal 7.5
drink 7.5
restaurant 7.5
technology 7.4
businesswoman 7.3
looking 7.2
suit 7.2
handsome 7.1

Microsoft
created on 2022-01-23

person 98
text 97.7
food 94.2
black and white 93.8
clothing 87.2
monochrome 57.3
preparing 40.6
cooking 22.2

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 80.8%
Calm 99.9%
Sad 0%
Confused 0%
Surprised 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 48-56
Gender Female, 94.1%
Calm 87.7%
Sad 9.6%
Confused 1.4%
Disgusted 0.4%
Surprised 0.3%
Angry 0.3%
Happy 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a person standing in front of a cake 33.6%

Text analysis

Amazon

MJI3
٢٢
MJI3 YESTAD ОСЛИА
YESTAD
ОСЛИА

Google

02MA
YT3RA2
MJIH YT3RA2 02MA
MJIH