Human Generated Data

Title

Untitled (family on lawn with '25' cake)

Date

1944

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1622

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.4
Human 99.4
Person 99.3
Person 99.1
Person 98.9
Person 98
Clothing 89.9
Apparel 89.9
People 87.1
Face 81.7
Photography 62.5
Photo 62.5
Family 57.1
Female 55.5
Suit 55.1
Coat 55.1
Overcoat 55.1
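
The Amazon label/confidence pairs above are typical of AWS Rekognition label detection, where each label carries a confidence score in percent. A minimal sketch of such a call, assuming boto3 with standard AWS credentials; the image path, label limit, and confidence threshold are placeholders rather than values from this record:

import boto3

def rekognition_labels(image_path: str, min_confidence: float = 50.0):
    """Return (label, confidence) pairs like the Amazon tag list above."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=min_confidence,
        )
    # Confidence is reported as a percentage, e.g. "Person 99.4".
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]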

Clarifai
created on 2023-10-15

people 99.6
group 99
man 95.3
woman 93.9
adult 93.4
wedding 91.6
monochrome 90.5
family 89.9
leader 89.2
three 88.1
child 87.3
retro 86.9
sit 86.6
groom 82.2
princess 80.7
nostalgic 80.1
interaction 79.6
nostalgia 79.2
ceremony 78.9
chair 78.7

Imagga
created on 2021-12-14

negative 50.8
film 41.3
photographic paper 29.7
photographic equipment 19.8
old 16.7
blackboard 15.6
people 15
kin 13.2
bride 12.3
grunge 11.9
religion 11.6
man 11.4
person 10.9
history 10.7
stall 10.5
adult 10.5
black 10.2
art 10.2
happiness 10.2
cemetery 10.2
male 9.9
men 9.4
vintage 9.1
couple 8.7
love 8.7
water 8.7
wedding 8.3
retro 8.2
sculpture 8.1
bouquet 8.1
color 7.8
ancient 7.8
marble 7.7
two 7.6
happy 7.5
groom 7.5
human 7.5
world 7.5
one 7.5
business 7.3
dress 7.2
statue 7.1
smile 7.1
portrait 7.1
face 7.1
architecture 7

Google
created on 2021-12-14

Black 89.6
Smile 89.3
Font 76.4
Motor vehicle 76.2
Monochrome photography 70.9
Vintage clothing 70.3
Event 70.1
Monochrome 68.8
Room 66.8
Art 66.3
Crew 63.6
Stock photography 63.6
Hat 62.6
History 62.4
Visual arts 60.8
Sitting 55.8
Classic 55.5
T-shirt 53.3
Team 51.3
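
The Google tags above correspond to label annotations from the Cloud Vision API, which reports scores on a 0-1 scale (shown here as percentages). A minimal sketch, assuming the google-cloud-vision client library and application-default credentials; the image path is a placeholder:

from google.cloud import vision

def google_labels(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Scale the 0-1 score to match the percentage-style values listed above.
    return [(label.description, round(label.score * 100, 1))
            for label in response.label_annotations]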

Microsoft
created on 2021-12-14

window 99.5
text 99.1
old 90.5
clothing 87.6
person 87.1
drawing 86.1
posing 80.1
man 79.1
human face 77.6
sketch 76.6
group 55.6

Face analysis

AWS Rekognition

Age 40-58
Gender Female, 71.6%
Sad 46.8%
Happy 25.6%
Calm 23.4%
Confused 2.3%
Surprised 0.7%
Fear 0.5%
Angry 0.4%
Disgusted 0.2%

AWS Rekognition

Age 41-59
Gender Female, 74.2%
Happy 46.5%
Calm 35%
Sad 11.2%
Confused 5.4%
Surprised 0.6%
Angry 0.6%
Fear 0.5%
Disgusted 0.2%

AWS Rekognition

Age 19-31
Gender Male, 64.9%
Happy 50.9%
Calm 14.3%
Sad 13.7%
Angry 13.5%
Confused 3.8%
Surprised 1.9%
Fear 1.3%
Disgusted 0.6%

AWS Rekognition

Age 41-59
Gender Male, 73.2%
Calm 49.5%
Happy 27%
Sad 20.2%
Confused 1.7%
Surprised 0.6%
Angry 0.5%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 43-61
Gender Male, 81.6%
Calm 51%
Happy 41.8%
Sad 2.5%
Confused 2%
Surprised 1.2%
Angry 1%
Fear 0.2%
Disgusted 0.2%
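
Each AWS Rekognition block above summarizes one detected face: an estimated age range, a gender guess with its confidence, and a set of emotion scores that sum to roughly 100%. A minimal sketch of extracting those fields, assuming boto3; the image path is a placeholder:

import boto3

def rekognition_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age": face["AgeRange"],      # e.g. {"Low": 40, "High": 58}
            "gender": face["Gender"],     # e.g. {"Value": "Female", "Confidence": 71.6}
            "emotions": sorted(face["Emotions"],
                               key=lambda e: e["Confidence"], reverse=True),
        })
    return faces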

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
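
The Google Vision entries above report per-face likelihood buckets (from "Very unlikely" to "Very likely") rather than numeric scores. A minimal sketch of reading those buckets, assuming the google-cloud-vision client library; the image path is a placeholder:

from google.cloud import vision

def google_face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum such as VERY_UNLIKELY or POSSIBLE.
        results.append({
            "surprise": face.surprise_likelihood.name,
            "anger": face.anger_likelihood.name,
            "sorrow": face.sorrow_likelihood.name,
            "joy": face.joy_likelihood.name,
            "headwear": face.headwear_likelihood.name,
            "blurred": face.blurred_likelihood.name,
        })
    return results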

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

4-9050
6011
011
Z
YT3302
58
GOLL
M-113
M-113 YT3302 02200
344
THE
٠٢
02200
ة
HNE
the
1214
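
The Amazon strings above are OCR detections of the kind returned by Rekognition text detection, including partial and misread fragments. A minimal sketch, assuming boto3; the image path is a placeholder:

import boto3

def rekognition_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Each detection is a LINE or WORD with its own confidence score.
    return [(det["DetectedText"], det["Type"], round(det["Confidence"], 1))
            for det in response["TextDetections"]]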

Google

4-9050
4-9050 4-9050