Human Generated Data

Title

Untitled (children opening presents while adults sit in chairs inside house)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9144

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.2
Person 99.1
Person 98.9
Person 98.3
Person 97.7
Person 97.7
Person 97.5
Room 97.2
Indoors 97.2
Person 96
Person 95
Person 93.9
Interior Design 93.4
Person 92.6
Person 88.4
Bedroom 84.2
Clothing 81.4
Apparel 81.4
Classroom 77.8
School 77.8
Female 76.4
Crowd 73.1
People 71.8
Girl 67.6
Shoe 67
Footwear 67
Furniture 64.3
Face 63
Advertisement 62.4
Poster 60.2
Housing 59.8
Building 59.8
Audience 59.5
Living Room 59.4
Person 58.2
Bed 58
Urban 57.3
Kid 55.9
Child 55.9
Dorm Room 55.8
Collage 55.7
Woman 55.7
Person 55.2
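
Label/confidence pairs of this shape are what Amazon Rekognition's DetectLabels API returns. Below is a minimal sketch of how such tags could be generated, assuming boto3 with configured AWS credentials and a hypothetical local copy of the photograph; it is illustrative, not the pipeline actually used for this record.

```python
# Illustrative sketch: list labels and confidences for an image using
# Amazon Rekognition's DetectLabels API. The filename and the MinConfidence
# threshold are assumptions, not values taken from this record.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("schweig_untitled_1948.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest confidence shown above
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```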

Clarifai
created on 2023-10-26

people 99.9
group 99.4
many 98.1
group together 97.8
man 96.7
adult 96.4
woman 96.4
wear 94.2
recreation 93.8
furniture 89.2
child 89.1
administration 88.7
audience 88
dancing 87
actress 86.5
monochrome 86.3
several 84
outfit 83.5
sit 82
actor 81.9

Imagga
created on 2022-01-23

salon 56.3
shop 36.4
barbershop 33.4
man 32.9
people 29.5
person 28.2
male 25.5
room 23
mercantile establishment 22.9
hairdresser 19.8
office 19.1
adult 19.1
indoors 18.4
computer 16
business 15.8
place of business 15.6
men 15.4
work 14.9
desk 14.4
home 14.3
horizontal 14.2
sitting 13.7
chair 13.5
teacher 12.4
table 12.4
interior 12.4
working 12.4
professional 12.1
happy 11.9
portrait 11.6
group 11.3
senior 11.2
indoor 10.9
lifestyle 10.8
smile 10.7
businessman 10.6
education 10.4
mature 10.2
smiling 10.1
modern 9.8
worker 9.8
job 9.7
meeting 9.4
equipment 9.4
phone 9.2
black 9
old 8.4
laptop 8.3
classroom 8.3
fashion 8.3
holding 8.2
music 8.1
family 8
looking 8
medical 7.9
women 7.9
look 7.9
establishment 7.9
couple 7.8
casual 7.6
communication 7.6
child 7.5
style 7.4
occupation 7.3
alone 7.3
seat 7.2
decoration 7.2
team 7.2
handsome 7.1
patient 7
together 7
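
Tags like the Imagga list above can typically be requested from Imagga's public REST tagging endpoint. A rough sketch follows, assuming the /v2/tags endpoint with HTTP basic auth and a hosted copy of the image at a placeholder URL; the API key, image URL, and response layout are assumptions based on Imagga's documented format, not details from this record.

```python
# Rough sketch: request tags for an image from Imagga's /v2/tags endpoint.
# The API key/secret and image URL are placeholders; the response layout is
# assumed from Imagga's public documentation.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/schweig_untitled_1948.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], tag["confidence"])
```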

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 95.8
text 95.2
clothing 80.7
clothes 42.6

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 98.8%
Surprised 0.8%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 87.2%
Calm 70.1%
Happy 18.2%
Sad 7.3%
Confused 2.8%
Angry 0.7%
Disgusted 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 19-27
Gender Male, 91.3%
Sad 79.5%
Calm 15%
Happy 2.4%
Confused 2.1%
Disgusted 0.3%
Angry 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Male, 97.8%
Happy 43.6%
Disgusted 18.2%
Sad 11.2%
Calm 9.7%
Surprised 6.6%
Fear 4.3%
Angry 4.1%
Confused 2.3%

AWS Rekognition

Age 28-38
Gender Male, 73.4%
Calm 75.2%
Sad 11.9%
Angry 6.2%
Confused 2.5%
Happy 1.7%
Disgusted 1.2%
Fear 0.7%
Surprised 0.6%
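
The per-face blocks above (age range, gender, and emotion percentages) match the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again assuming boto3 with configured credentials and a placeholder filename:

```python
# Minimal sketch: per-face age range, gender, and emotion scores using
# Amazon Rekognition's DetectFaces API. The filename is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("schweig_untitled_1948.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```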

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
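
The likelihood-style blocks above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) correspond to Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library (2.x or later), application-default credentials, and a placeholder filename:

```python
# Minimal sketch: likelihood-style face attributes reported by Google Cloud
# Vision face detection. Filename and credentials are assumptions.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_untitled_1948.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```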

Feature analysis

Amazon

Person 99.5%
Shoe 67%

Text analysis

Amazon

8
3
3 ٢٢
٢٢
MJIF
MJIF YT37AS ACHMA
YT37AS
ACHMA
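
The fragments above are raw OCR detections. Amazon Rekognition's DetectText API returns both whole LINE detections and their individual WORDs, which is why "MJIF YT37AS ACHMA" appears both intact and split apart. A minimal sketch under the same assumptions as the earlier boto3 examples:

```python
# Minimal sketch: raw OCR fragments via Amazon Rekognition's DetectText API.
# Filename and credentials are assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("schweig_untitled_1948.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Both LINE and WORD detections are returned, so whole lines and their
    # individual words can both appear in the output.
    print(detection["DetectedText"])
```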

Google

MJI7 YT3RA2
MJI7
YT3RA2
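
The Google results follow the same pattern: Cloud Vision text detection returns the full detected string first, followed by the individual words. A minimal sketch, under the same assumptions as the face-detection example above:

```python
# Minimal sketch: OCR with Google Cloud Vision text detection. The filename
# and credentials are assumptions; the first annotation is the full detected
# text, the remaining annotations are individual words.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_untitled_1948.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```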