Human Generated Data

Title

Untitled (modeling swimsuits)

Date

c. 1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1870

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 99.5
Person 99.2
Person 99.1
Person 99.1
Person 99.1
Person 98.9
Person 98.7
Person 97.6
Clothing 97.2
Apparel 97.2
Person 97
Person 95.2
Person 91.7
Person 91.3
Person 91.1
People 87.2
Person 86.1
Person 84.7
Poster 84
Advertisement 84
Shorts 83.1
Person 79.4
Female 75.6
Suit 73.1
Coat 73.1
Overcoat 73.1
Face 71.1
Outdoors 67.2
Crowd 67
Robe 64.6
Fashion 64.6
Person 63.9
Person 63.3
Girl 61.9
Gown 61.1
Wedding 57.7
Tree 57.7
Plant 57.7
Nature 56.1
Woman 55.9
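
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. As a minimal sketch (the image path, threshold, and helper names are illustrative assumptions, not part of this record), the listing format can be reproduced like this:

```python
def format_labels(response):
    """Flatten a DetectLabels-style response into 'Name Confidence' lines,
    rounding confidence to one decimal as in the listing above."""
    return [f"{lbl['Name']} {round(lbl['Confidence'], 1)}"
            for lbl in response['Labels']]

def detect_labels(image_path, min_confidence=50):
    """Call AWS Rekognition DetectLabels on a local image (sketch only;
    requires boto3 and configured AWS credentials)."""
    import boto3  # imported here so format_labels works without the dependency
    client = boto3.client('rekognition')
    with open(image_path, 'rb') as f:
        response = client.detect_labels(Image={'Bytes': f.read()},
                                        MinConfidence=min_confidence)
    return format_labels(response)
```

The same response structure is shared by the other tagging services listed here, differing only in field names and score scales.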

Clarifai
created on 2023-10-25

people 99.9
child 98.8
many 98.1
adult 97.8
group 96.1
boy 95.4
man 95
group together 94.8
woman 92.3
crowd 91.6
monochrome 88.7
wear 88.2
military 82.1
education 81.5
uniform 80.3
black and white 80.2
war 77.9
art 77.8
school 76.8
desktop 76.5

Imagga
created on 2021-12-14

cemetery 26.4
picket fence 26.3
fence 24
structure 23.3
negative 19.5
landscape 18.6
grunge 17
snow 16.7
barrier 16.5
film 15.2
silhouette 14.9
old 14.6
fountain 14.6
black 14.4
art 14
winter 13.6
sky 12.8
texture 12.5
park 12.3
tree 12.3
scene 12.1
dirty 11.7
dark 11.7
stone 11.6
vintage 11.6
grungy 11.4
light 11.4
water 11.3
forest 11.3
ice 11.2
obstruction 11.1
pattern 10.9
travel 10.6
decoration 10.5
sun 10.5
antique 10.4
cold 10.3
people 10
outdoor 9.9
country 9.7
gravestone 9.6
photographic paper 9.5
man 9.4
graffito 9.4
field 9.2
retro 9
design 9
sunset 9
trees 8.9
wall 8.7
crowd 8.6
summer 8.4
frame 8.3
lake 8.2
rough 8.2
paint 8.1
brown 8.1
group 8.1
night 8
rural 7.9
rock 7.8
season 7.8
memorial 7.8
outdoors 7.5
grain 7.4
graphic 7.3
aged 7.2
color 7.2
scenery 7.2
holiday 7.2
sunlight 7.1
cool 7.1
paper 7.1
sea 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.6
person 98.3
clothing 85.5
man 85.1
old 83
black 66.5
posing 45.5
vintage 27.6
picture frame 23

Face analysis

Amazon

Google

AWS Rekognition

Age 26-42
Gender Male, 64.3%
Sad 95.6%
Fear 1.6%
Calm 1.5%
Confused 0.4%
Happy 0.4%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 27-43
Gender Male, 70.3%
Happy 55.2%
Calm 35.6%
Sad 3.6%
Fear 2.2%
Surprised 1.2%
Angry 1%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 22-34
Gender Female, 60.2%
Happy 69.3%
Calm 24.1%
Sad 3.4%
Angry 1.1%
Surprised 1%
Disgusted 0.5%
Confused 0.4%
Fear 0.2%

AWS Rekognition

Age 51-69
Gender Male, 82.2%
Calm 66.6%
Surprised 12.8%
Happy 10%
Angry 7.4%
Sad 1.2%
Disgusted 1.1%
Confused 0.6%
Fear 0.3%

AWS Rekognition

Age 52-70
Gender Female, 67%
Calm 65.9%
Sad 14.2%
Angry 8.9%
Confused 4.6%
Surprised 2.9%
Happy 1.6%
Fear 1.3%
Disgusted 0.5%

AWS Rekognition

Age 26-40
Gender Male, 97.4%
Calm 91.5%
Happy 6.5%
Sad 0.7%
Disgusted 0.5%
Confused 0.4%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 27-43
Gender Male, 86.8%
Calm 61.1%
Sad 11.8%
Surprised 10.9%
Happy 6.8%
Confused 4.8%
Fear 2.5%
Angry 1.5%
Disgusted 0.5%

AWS Rekognition

Age 39-57
Gender Male, 95.7%
Calm 97.4%
Happy 1.3%
Surprised 0.6%
Sad 0.4%
Confused 0.2%
Fear 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 24-38
Gender Female, 59.7%
Calm 46.9%
Angry 21.4%
Happy 14%
Sad 10.2%
Confused 2.5%
Disgusted 2.5%
Surprised 1.5%
Fear 1.1%

AWS Rekognition

Age 47-65
Gender Female, 82.7%
Calm 71.3%
Happy 20.6%
Sad 2.1%
Surprised 1.9%
Disgusted 1.5%
Confused 1%
Fear 0.9%
Angry 0.6%

AWS Rekognition

Age 24-38
Gender Female, 54.4%
Calm 68%
Surprised 10.2%
Sad 6.7%
Fear 6.2%
Happy 4.4%
Confused 2.3%
Angry 1.8%
Disgusted 0.4%

AWS Rekognition

Age 25-39
Gender Female, 76.2%
Calm 62.2%
Surprised 27.2%
Sad 4.8%
Angry 1.7%
Happy 1.4%
Fear 1.2%
Confused 1.1%
Disgusted 0.5%

AWS Rekognition

Age 32-48
Gender Female, 69.7%
Calm 92.5%
Sad 6.6%
Happy 0.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 50-68
Gender Male, 75.9%
Calm 99.4%
Happy 0.3%
Sad 0.1%
Surprised 0.1%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 24-38
Gender Female, 50.1%
Calm 34.1%
Sad 33.3%
Happy 24.3%
Fear 3.5%
Angry 2.6%
Disgusted 0.9%
Confused 0.7%
Surprised 0.7%

AWS Rekognition

Age 34-50
Gender Female, 90.8%
Sad 99.8%
Calm 0.1%
Confused 0.1%
Fear 0%
Surprised 0%
Angry 0%
Disgusted 0%
Happy 0%
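
Each AWS Rekognition block above corresponds to one entry in the `FaceDetails` list returned by the DetectFaces API: an age range, a gender estimate with confidence, and emotion scores sorted by confidence. A minimal sketch of that mapping (helper names are illustrative assumptions):

```python
def format_face(detail):
    """Summarize one FaceDetails entry in the style of the listing above:
    age range, gender with confidence, then emotions by descending score."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {round(detail['Gender']['Confidence'], 1)}%",
    ]
    for emo in sorted(detail['Emotions'], key=lambda e: -e['Confidence']):
        # Rekognition emotion types are uppercase (e.g. 'SAD'); title-case them
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines

def detect_faces(image_path):
    """Call AWS Rekognition DetectFaces on a local image (sketch only;
    requires boto3 and configured AWS credentials)."""
    import boto3  # imported here so format_face works without the dependency
    client = boto3.client('rekognition')
    with open(image_path, 'rb') as f:
        resp = client.detect_faces(Image={'Bytes': f.read()},
                                   Attributes=['ALL'])
    return [format_face(d) for d in resp['FaceDetails']]
```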

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Poster 84%

Categories

Imagga

paintings art 92.2%
text visuals 7.6%