Human Generated Data

Title

Untitled (eight family members in formal wear posed in living room)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9350

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 98.9
Person 98.5
Person 98.3
Person 98.2
Person 98
Person 96.6
Person 96.1
Clothing 94.1
Apparel 94.1
Tie 93.2
Accessories 93.2
Accessory 93.2
Clinic 87.4
Shirt 73.2
Face 67.8
People 66.4
Sunglasses 65.3
Doctor 58.3
Portrait 57.3
Photography 57.3
Photo 57.3
Nurse 57.1
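
The label-and-confidence pairs above are the kind of output returned by an image-labeling service such as Amazon Rekognition's DetectLabels. The sketch below shows, under stated assumptions, how tags in this form could be generated; the image file name, region, and threshold values are illustrative and are not part of the museum record.

import boto3  # AWS SDK for Python

# Hypothetical local scan of the photograph; not part of the record.
IMAGE_PATH = "photograph.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence scores,
# comparable to the "Person 99.3", "Tie 93.2", ... entries listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")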

Clarifai
created on 2023-10-27

people 99.9
group 99.7
adult 99
medical practitioner 98.9
man 97.6
group together 96.7
several 96.2
woman 95.2
leader 95.1
five 94.8
outerwear 94.4
many 92.5
administration 92.4
uniform 89.1
three 86.8
outfit 86.2
elderly 85.6
four 83.5
ailment 82.9
wear 82.8

Imagga
created on 2022-01-23

patient 83.8
person 60.2
case 51.5
sick person 50.5
nurse 42.3
barbershop 42.1
man 38.3
people 32.4
shop 31.9
male 28.4
medical 27.4
adult 25.7
mercantile establishment 25.3
hospital 23.9
professional 21.2
doctor 20.7
happy 20.7
smiling 19.5
health 19.5
room 19.2
men 18.9
indoors 18.5
couple 18.3
senior 17.8
office 17.7
home 17.6
place of business 16.8
businessman 15.9
illness 15.3
job 15
work 14.9
occupation 14.7
business 14.6
team 14.3
specialist 13.8
smile 13.5
family 13.4
talking 13.3
working 13.3
medicine 13.2
care 13.2
clinic 13.1
lifestyle 13
coat 12.8
worker 12.7
women 12.7
two people 12.6
sick 12.6
sitting 12
two 11.9
happiness 11.8
hairdresser 11.7
30s 11.5
bed 11.4
cheerful 11.4
group 11.3
mature 11.2
portrait 11
colleagues 10.7
elderly 10.5
together 10.5
teamwork 10.2
casual 10.2
indoor 10
color 10
businesswoman 10
old 9.8
mid adult 9.6
lab coat 9.3
to 8.9
caring 8.8
looking 8.8
40s 8.8
middle aged 8.8
laboratory 8.7
clothing 8.7
education 8.7
businesspeople 8.5
mother 8.5
establishment 8.4
inside 8.3
20s 8.2
aged 8.2
dress 8.1
life 8
interior 8
day 7.8
50s 7.8
casual clothing 7.8
standing 7.8
face 7.8
lab 7.8
affectionate 7.7
corporate 7.7
attractive 7.7
husband 7.6
desk 7.6
meeting 7.5
instrument 7.5
manager 7.5
holding 7.4
treatment 7.4
uniform 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.9
clothing 88
person 82.3
woman 75.9
old 69.5
posing 57.4
man 52.8
smile 52
clothes 26.1

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 98.7%
Calm 47.4%
Confused 42.8%
Sad 5.4%
Happy 2.1%
Surprised 1%
Disgusted 0.6%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 30-40
Gender Male, 97.9%
Happy 88.3%
Surprised 6.6%
Calm 2.5%
Sad 0.9%
Disgusted 0.5%
Confused 0.4%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 45-53
Gender Male, 99.6%
Happy 62.6%
Sad 15.6%
Calm 7.8%
Surprised 5.8%
Confused 4%
Disgusted 2%
Angry 1.2%
Fear 1.1%

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Sad 33.7%
Happy 30.1%
Calm 29.2%
Confused 2.3%
Disgusted 1.6%
Surprised 1.1%
Angry 1%
Fear 1%

AWS Rekognition

Age 34-42
Gender Male, 100%
Calm 57.1%
Surprised 19.4%
Happy 13.9%
Sad 3.3%
Confused 2.9%
Disgusted 1.9%
Angry 0.8%
Fear 0.6%

AWS Rekognition

Age 41-49
Gender Male, 99.3%
Calm 60.9%
Angry 13.1%
Happy 12.7%
Surprised 8.6%
Sad 2%
Fear 1.2%
Confused 1%
Disgusted 0.5%

AWS Rekognition

Age 48-56
Gender Male, 99.7%
Happy 56.3%
Surprised 36.4%
Confused 3.3%
Sad 1%
Fear 0.8%
Disgusted 0.8%
Calm 0.8%
Angry 0.6%

AWS Rekognition

Age 25-35
Gender Male, 91.7%
Happy 99.8%
Surprised 0.1%
Calm 0%
Angry 0%
Disgusted 0%
Sad 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 99.8%
Calm 88.3%
Sad 6.9%
Happy 2.3%
Fear 1.3%
Surprised 0.4%
Disgusted 0.3%
Angry 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
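
The per-face age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above match the fields returned by Rekognition's DetectFaces call when all facial attributes are requested. A minimal sketch follows; the image path and the choice to print only the top-scoring emotion are assumptions made for illustration.

import boto3  # AWS SDK for Python

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates,
# the same fields reported for each detected face above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    print(f"{top_emotion['Type'].title()} {top_emotion['Confidence']:.1f}%")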

Feature analysis

Amazon

Person 99.3%
Tie 93.2%
Sunglasses 65.3%

Text analysis

Amazon

a
MJI7
L a a
L
MJI7 YE37AS
YE37AS
r

Google

MJ17 YT33A2 A
MJ17
YT33A2
A