Human Generated Data

Title

Untitled (child, parents, and grandparents seated in yard)

Date

c. 1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1998

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.9
Apparel 99.9
Person 99.6
Human 99.6
Person 99.4
Person 99.1
Person 99.1
Person 98.9
Dress 95.1
Suit 94.9
Overcoat 94.9
Coat 94.9
Chair 88.7
Furniture 88.7
Face 87.1
People 85.3
Female 82.9
Robe 80.6
Fashion 80.6
Gown 79.4
Wedding 73.6
Person 70.9
Grass 69.7
Plant 69.7
Outdoors 69.5
Portrait 69.4
Photography 69.4
Photo 69.4
Kid 65.7
Child 65.7
Bridegroom 65.7
Wedding Gown 65.7
Woman 65.1
Shorts 64
Girl 61.5
Field 60.6
Man 60
Shirt 58.8
Tuxedo 58.5
Nature 57.6
Smile 57.1
Yard 55

Clarifai
created on 2023-10-26

people 100
adult 98.6
group 98.2
man 96.8
group together 95.6
woman 95.1
leader 93.6
wear 93
nostalgia 92.6
many 90.5
outfit 90.3
chair 89.1
child 88.5
music 87.6
portrait 84.7
musician 82.5
administration 81.4
monochrome 79.2
several 78.8
recreation 78.2

Imagga
created on 2021-12-14

negative 30.5
film 26.5
grunge 18.7
photographic paper 17.5
black 16.2
musical instrument 14.8
silhouette 14.1
old 13.9
people 13.4
man 12.8
style 12.6
vintage 12.4
photographic equipment 11.7
art 11.6
structure 11.2
person 11
male 10.6
wind instrument 10.5
accordion 10.2
dark 10
outdoor 9.9
antique 9.5
play 9.5
fountain 9.1
sunset 9
keyboard instrument 8.5
texture 8.3
dirty 8.1
child 8.1
water 8
night 8
men 7.7
power 7.6
sport 7.5
leisure 7.5
light 7.4
performer 7.3
lady 7.3
paint 7.2
aged 7.2
kin 7.2
player 7.1
summer 7.1
sky 7

Google
created on 2021-12-14

Plant 88.3
Dress 84.5
Picture frame 82.3
Adaptation 79.2
Tree 78.8
Tints and shades 77.3
Suit 76.6
Grass 76.5
Vintage clothing 73.3
Monochrome 72.4
Chair 71.7
Monochrome photography 70.5
Classic 70.3
Event 69.5
Photo caption 64.8
Stock photography 64.5
Room 64.2
History 62.6
Art 61.6
Sitting 59.7

Microsoft
created on 2021-12-14

text 97.8
person 93.7
clothing 89.2
posing 80.4
player 76.7
man 67.6
old 53.8
drawing 52.5
team 40.5
vintage 34

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 94.5%
Calm 89.7%
Sad 4.2%
Happy 2.3%
Fear 1.4%
Surprised 0.9%
Angry 0.6%
Confused 0.6%
Disgusted 0.3%

AWS Rekognition

Age 50-68
Gender Male, 97.7%
Calm 97.5%
Happy 1.3%
Surprised 0.4%
Sad 0.3%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 50-68
Gender Male, 95.1%
Calm 85.9%
Sad 12.5%
Surprised 0.4%
Confused 0.4%
Fear 0.4%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-37
Gender Female, 51.9%
Calm 99.5%
Sad 0.2%
Surprised 0.1%
Happy 0.1%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 23-35
Gender Male, 53%
Calm 96.9%
Surprised 1.2%
Happy 0.6%
Confused 0.5%
Sad 0.4%
Angry 0.4%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 17-29
Gender Female, 68.5%
Calm 65.9%
Happy 19.9%
Sad 7.7%
Angry 3.3%
Disgusted 1%
Surprised 0.9%
Confused 0.8%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 96.7%
interior objects 2.6%