Human Generated Data

Title

Untitled (Christmas morning, St. Louis, Missouri)

Date

1948, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.56

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 99.6
Person 97.8
Person 97.8
Person 97.5
People 97
Person 96.7
Person 93.2
Family 92.3
Person 91.4
Person 90.1
Shoe 86.6
Footwear 86.6
Clothing 86.6
Apparel 86.6
Face 84
Wheel 74.7
Machine 74.7
Female 70.4
Person 64.4
Couch 64.4
Furniture 64.4
Kid 62.4
Child 62.4
Photography 61.9
Photo 61.9
Transportation 60.2
Girl 59.3
Meal 59
Food 59
Portrait 58.9
Vehicle 57.6
Car 57.6
Automobile 57.6
Workshop 55.4
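
The Amazon tags above pair a label with a confidence score from AWS Rekognition label detection. Below is a minimal sketch of how such tags are typically produced with boto3; the file name and region are placeholders, not values taken from this record.

import boto3

# Assumes AWS credentials are configured; "schweig_1948.jpg" is a hypothetical
# local copy of the photograph, not the museum's actual asset name.
client = boto3.client("rekognition", region_name="us-east-1")

with open("schweig_1948.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(Image={"Bytes": image_bytes}, MaxLabels=50)

# Each label carries a name and a 0-100 confidence, e.g. "Person 99.6".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')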

Clarifai
created on 2023-10-25

people 100
group 99.2
group together 98.7
adult 98.5
vehicle 98.5
woman 98.1
man 97.5
monochrome 95.9
transportation system 94.5
leader 92.6
offspring 92.2
child 92.1
several 91.9
three 91
administration 90.6
many 90.4
four 89.7
car 88.5
recreation 87.7
war 84.8

Imagga
created on 2022-01-08

kin 81
man 35.6
family 33.8
people 33.5
happy 31.3
mother 31.2
male 30.7
adult 30
person 27.4
golf equipment 25.3
motor vehicle 25.3
smiling 25.3
couple 24.4
child 23.4
sitting 23.2
happiness 21.9
daughter 21.9
smile 20.7
father 20.2
together 20.1
love 19.7
lifestyle 19.5
casual 19.5
home 19.1
sports equipment 19
portrait 18.8
parent 18.1
women 17.4
wheeled vehicle 16.4
joy 15.9
equipment 15.3
senior 15
group 14.5
couch 14.5
looking 14.4
room 14.3
fun 14.2
kid 14.2
indoors 14.1
cute 13.6
sofa 13.4
attractive 13.3
boy 13
son 12.9
men 12.9
patient 12.8
business 12.8
outdoors 12.7
husband 12.4
care 12.3
togetherness 12.3
cheerful 12.2
vehicle 11.7
professional 11.4
playing 10.9
childhood 10.7
retired 10.7
job 10.6
wife 10.4
youth 10.2
book 10.1
laptop 10
student 10
office 9.6
enjoying 9.5
clothing 9.5
sit 9.5
work 9.4
house 9.2
teen 9.2
school 9.2
outdoor 9.2
20s 9.2
leisure 9.1
teenager 9.1
park 9.1
grandfather 8.9
working 8.8
little 8.8
medical 8.8
computer 8.8
affection 8.7
married 8.6
elderly 8.6
play 8.6
jeans 8.6
illness 8.6
wheelchair 8.6
outside 8.6
talking 8.6
chair 8.5
friends 8.5
friendship 8.4
occupation 8.2
girls 8.2
businesswoman 8.2
worker 8
caring 7.9
education 7.8
30s 7.7
old 7.7
health 7.6
baby 7.6
reading 7.6
college 7.6
living 7.6
dad 7.6
kids 7.5
relaxed 7.5
relationship 7.5
mature 7.4
technology 7.4
car 7.4
life 7.3
children 7.3
course 7.3
team 7.2
handsome 7.1
bench 7.1
interior 7.1
businessman 7.1
day 7.1
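
The Imagga tags follow the same label-and-confidence pattern. One plausible way to request them is the Imagga v2 tagging endpoint over HTTP basic auth; the key, secret, and image URL below are placeholders rather than values from this record.

import requests

# Hypothetical credentials and image URL; Imagga returns confidences on a
# 0-100 scale, matching the "kin 81" style above.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/schweig_1948.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))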

Google
created on 2022-01-08

Tire 94.3
Vehicle 93.5
Car 89.7
Motor vehicle 89.4
Wheel 87.7
Suit 83.5
Picture frame 82.6
Plant 82.5
Toddler 76.7
Kit car 76.4
Snapshot 74.3
Vintage clothing 72.9
Classic 72.3
Classic car 71.8
Family reunion 64.6
History 64.6
Family car 64.4
Antique car 62.4
Sitting 61
Vintage car 58.1
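
The Google tags are label annotations of the kind returned by the Cloud Vision API. Below is a minimal sketch using the google-cloud-vision client; credentials and the file name are assumptions, not part of the catalog record.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the file name is hypothetical.
client = vision.ImageAnnotatorClient()

with open("schweig_1948.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores come back in the 0-1 range and are shown above as percentages.
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))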

Microsoft
created on 2022-01-08

text 99
person 98.6
clothing 95.7
posing 84.4
human face 80.3
man 74.8
people 68.6
group 66.3
smile 65.1
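
The Microsoft tags are consistent with the Azure Computer Vision tagging operation. Below is a hedged sketch using the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical Azure resource endpoint and key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

# tag_image accepts a public image URL and returns tags with 0-1 confidences.
result = client.tag_image("https://example.org/schweig_1948.jpg")
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))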

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 95.5%
Calm 96.3%
Happy 1.1%
Sad 0.9%
Confused 0.7%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 0-4
Gender Male, 80.3%
Calm 74.2%
Happy 23.3%
Confused 0.7%
Sad 0.5%
Surprised 0.4%
Fear 0.3%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 98.8%
Surprised 0.5%
Confused 0.2%
Calm 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Sad 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Happy 97.9%
Surprised 0.8%
Sad 0.4%
Calm 0.3%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 6-12
Gender Female, 99.9%
Happy 99.5%
Surprised 0.2%
Disgusted 0.1%
Calm 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Sad 0%

AWS Rekognition

Age 41-49
Gender Male, 99.4%
Happy 86.9%
Fear 6.9%
Sad 2.1%
Surprised 1.2%
Angry 0.8%
Disgusted 0.8%
Confused 0.7%
Calm 0.6%

AWS Rekognition

Age 36-44
Gender Male, 51.4%
Happy 96.3%
Confused 1%
Disgusted 0.6%
Surprised 0.6%
Fear 0.5%
Angry 0.4%
Sad 0.3%
Calm 0.3%

AWS Rekognition

Age 45-53
Gender Female, 100%
Happy 79.9%
Surprised 7.9%
Calm 3.2%
Angry 2.6%
Disgusted 2.6%
Fear 1.6%
Sad 1.1%
Confused 1%

AWS Rekognition

Age 52-60
Gender Male, 100%
Happy 83.2%
Calm 5.8%
Surprised 4.7%
Fear 1.5%
Disgusted 1.5%
Angry 1.4%
Sad 1.2%
Confused 0.8%

AWS Rekognition

Age 2-8
Gender Female, 100%
Calm 68.3%
Surprised 16.3%
Sad 10%
Happy 1.5%
Fear 1.3%
Angry 1.1%
Confused 1%
Disgusted 0.6%
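
The per-face age ranges, gender calls, and emotion percentages above have the shape of output that AWS Rekognition face detection returns when all attributes are requested. Below is a minimal sketch with boto3; credentials and the file name are assumptions.

import boto3

# Attributes=["ALL"] is what adds AgeRange, Gender, and Emotions to the response.
client = boto3.client("rekognition", region_name="us-east-1")

with open("schweig_1948.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')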

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 4
Gender Male

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 48
Gender Female

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 51
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
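
The Google Vision face results above report likelihood buckets (Very unlikely through Very likely) rather than percentages. Below is a minimal sketch of face detection with the google-cloud-vision client; credentials and the file name are assumptions.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_1948.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes likelihood enums such as VERY_UNLIKELY or VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)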

Feature analysis

Amazon

Person 99.6%
Shoe 86.6%
Wheel 74.7%

Text analysis

Google

しつ
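
The detected string "しつ" is a short OCR fragment of the sort the Cloud Vision text detection endpoint returns for incidental text in a photograph. Below is a minimal sketch; credentials and the file name are assumptions.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_1948.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation, when present, holds the full detected text.
if response.text_annotations:
    print(response.text_annotations[0].description)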