Human Generated Data

Title

Untitled (line of guests standing in buffet line at fancy event)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9247

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 98.4
Person 97.8
Person 97.5
Person 95.8
Person 93.8
Person 92.3
Meal 92
Food 92
Person 89.6
Clothing 87.4
Apparel 87.4
Dish 81.3
People 76.4
Person 72.8
Person 71.5
Person 70.8
Lamp 66.6
Photography 62.8
Photo 62.8
Person 62.3
Cafeteria 61.1
Restaurant 61.1
Gown 59.7
Fashion 59.7
Robe 59
Wedding 57.5
Table 56.7
Furniture 56.7
Chandelier 55.7
Leisure Activities 55.2

Clarifai
created on 2023-10-27

people 99.9
group 99.5
adult 98.1
man 97.1
group together 97
woman 96.7
many 96.4
monochrome 96
leader 92.9
administration 91.7
several 91.2
furniture 90.4
military 89.8
child 86.8
war 85.4
illustration 85.3
commerce 81.5
sit 80.8
recreation 80.2
medical practitioner 79.8

Imagga
created on 2022-01-23

man 33.6
person 26.5
people 25.6
male 24.1
shop 20.2
home 19.1
adult 18.9
smiling 18.1
salon 18
happy 17.5
room 16.9
lifestyle 16.6
counter 16.2
indoors 14.9
interior 14.1
business 14
classroom 13.9
sitting 13.7
smile 13.5
seller 13.3
education 13
clinic 12.9
women 12.6
office 12.6
barbershop 12.3
teacher 12.3
men 12
indoor 11.9
kitchen 11.6
class 11.6
cheerful 11.4
student 11.3
mercantile establishment 11.2
school 10.9
work 10.6
modern 10.5
senior 10.3
casual 10.2
patient 9.9
family 9.8
working 9.7
businessman 9.7
portrait 9.7
blackboard 9.6
standing 9.6
day 9.4
house 9.2
hospital 9.1
nurse 9
technology 8.9
chair 8.8
looking 8.8
boy 8.7
wife 8.5
two 8.5
mature 8.4
health 8.3
mother 8.3
back 8.3
holding 8.2
table 7.9
couple 7.8
happiness 7.8
color 7.8
child 7.8
life 7.7
30s 7.7
old 7.7
husband 7.6
adults 7.6
doctor 7.5
place of business 7.5
professional 7.4
stall 7.4
camera 7.4
lady 7.3
black 7.2
team 7.2
to 7.1
paper 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.9
person 96.9
clothing 79.5
people 72.5
funeral 68.9
table 64.7
man 63.3
black and white 50.3
several 11.7

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 98.5%
Calm 86.7%
Happy 6.7%
Sad 2.2%
Confused 1.2%
Surprised 1.1%
Fear 0.8%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 38-46
Gender Male, 96.3%
Sad 91.8%
Confused 3.8%
Calm 1.8%
Happy 1.2%
Disgusted 0.6%
Angry 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Male, 77.3%
Calm 47.7%
Happy 34.6%
Sad 7.2%
Angry 5.3%
Surprised 3.1%
Disgusted 1.1%
Fear 0.6%
Confused 0.3%

AWS Rekognition

Age 27-37
Gender Male, 95%
Happy 85.1%
Calm 8.5%
Sad 5.1%
Surprised 0.4%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

08
KODAK-SELA

Google

YT37A2- AO
YT37A2-
AO