Human Generated Data

Title

Untitled (man in military uniform reading Christmas cards beside woman and girl)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9234

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.1
Person 99.1
Person 98.5
Person 94.5
Apparel 87.9
Clothing 87.9
Home Decor 85.1
Leisure Activities 83.4
Chef 66.4
Sleeve 61.6
Musical Instrument 55.9
Guitar 55.9
Person 52.4

Imagga
created on 2022-01-23

brass 50.8
man 43.6
wind instrument 41.4
person 37
male 34.8
people 29.5
cornet 29.3
musical instrument 29.1
senior 26.2
adult 24.2
men 23.2
happy 22.5
mature 21.4
medical 20.3
smiling 19.5
professional 18.7
indoors 18.4
patient 18.2
job 17.7
doctor 16.9
businessman 16.8
smile 15.7
worker 15.5
work 15.5
health 15.3
home 15.1
hospital 15
couple 14.8
business 14.6
lifestyle 14.4
looking 13.6
office 12.8
device 12.5
happiness 12.5
elderly 12.4
care 12.3
medicine 12.3
nurse 12.3
portrait 12.3
teacher 11.9
women 11.9
holding 11.5
coat 11.5
bass 11.4
group 11.3
sitting 11.2
casual 11
team 10.7
handsome 10.7
stethoscope 10.2
black 10.2
camera 10.2
two 10.2
student 10
old 9.7
cheerful 9.7
older 9.7
retired 9.7
retirement 9.6
desk 9.4
room 9.2
occupation 9.2
executive 8.9
to 8.8
60s 8.8
together 8.8
middle aged 8.8
lab 8.7
standing 8.7
exam 8.6
clinic 8.6
profession 8.6
illness 8.6
horizontal 8.4
hand 8.3
human 8.2
laptop 8.2
technology 8.2
surgeon 8
working 7.9
day 7.8
boy 7.8
physician 7.8
education 7.8
laboratory 7.7
husband 7.6
house 7.5
inside 7.4
new 7.3
success 7.2
computer 7.2
suit 7.2
practitioner 7.2
family 7.1
interior 7.1
horn 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.7
man 96.4
text 89.1
clothing 84.8
human face 69.7
drawing 65.2
smile 61
woman 53.4

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Female, 80.5%
Calm 99.5%
Sad 0.2%
Happy 0.1%
Surprised 0.1%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 28-38
Gender Male, 58.5%
Sad 44.7%
Calm 29.6%
Happy 19.7%
Confused 1.7%
Fear 1.4%
Angry 1.1%
Disgusted 1%
Surprised 0.9%

AWS Rekognition

Age 16-24
Gender Female, 92.4%
Happy 85.3%
Calm 6.5%
Sad 3.8%
Fear 1.4%
Surprised 1.3%
Angry 1.1%
Disgusted 0.5%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a man standing in front of a group of people posing for the camera 64.5%
a man standing in front of a table 64.4%
a man standing in front of a group of people posing for a photo 58.8%

Text analysis

Amazon

st