Human Generated Data

Title

Untitled (family sewing suspenders)

Date

c. 1930

People

Artist: Lewis Wickes Hine, American, 1874-1940

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.809

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.3
Person 99.3
Person 99.3
Person 95.6
Person 94.6
Sitting 81.1
Indoors 71.5
Room 70.5
Leisure Activities 68.6
Musician 67.7
Musical Instrument 67.7
Painting 66.8
Art 66.8
Collage 59.9
Poster 59.9
Advertisement 59.9
Photo 58.6
Portrait 58.6
Face 58.6
Photography 58.6
Drawing 57.8
Person 57.7
Flooring 57.7
Guitar 57.4
Workshop 57.2
Furniture 57.1
Studio 56.1
Person 55.6

Clarifai
created on 2023-10-25

people 99.8
group 98.8
woman 96
family 95.4
group together 95.3
collage 94.5
child 94.3
man 93.8
adult 91.5
monochrome 91
room 89.2
education 88.2
music 87.6
portrait 85.8
three 85.6
furniture 85.4
school 83.8
vintage 80.6
art 77.8
desk 77.2

Imagga
created on 2021-12-14

people 17.3
monitor 17.3
equipment 17
man 16.8
silhouette 16.5
window 16.5
television 16.3
barbershop 15.6
design 15.2
black 14.4
person 13.6
art 13.4
shop 12.7
drawing 12.6
male 11.3
men 11.2
grunge 11.1
salon 10.5
mercantile establishment 10.1
office 9.9
retro 9
cartoon 8.9
women 8.7
frame 8.6
house 8.4
backboard 8.2
technology 8.2
music 8.1
graphic 8
light 8
home 8
electronic equipment 8
business 7.9
boy 7.8
reflection 7.8
play 7.8
modern 7.7
poster 7.6
computer 7.5
city 7.5
sketch 7.4
decoration 7.2
transportation 7.2
family 7.1
night 7.1
working 7.1
businessman 7.1

Microsoft
created on 2021-12-14

gallery 97.3
text 93.2
person 92.9
room 87.1
clothing 78.7
scene 72.6
black and white 72.2
posing 41.1
picture frame 9.6

Face analysis

AWS Rekognition

Age 44-62
Gender Female, 86.1%
Sad 96.8%
Calm 2.9%
Happy 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 5-15
Gender Female, 67.8%
Calm 79.6%
Confused 7.8%
Surprised 5.4%
Happy 3.9%
Angry 1.2%
Disgusted 0.9%
Sad 0.7%
Fear 0.5%

AWS Rekognition

Age 22-34
Gender Female, 97.5%
Calm 56.7%
Sad 41%
Confused 0.7%
Fear 0.5%
Happy 0.4%
Angry 0.4%
Surprised 0.2%
Disgusted 0%

AWS Rekognition

Age 21-33
Gender Male, 81.8%
Sad 97.5%
Calm 1%
Fear 0.7%
Angry 0.3%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 16-28
Gender Female, 96.3%
Sad 99.9%
Calm 0.1%
Fear 0%
Happy 0%
Confused 0%
Angry 0%
Surprised 0%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
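
The five AWS Rekognition face entries above can be summarized programmatically. Below is a minimal sketch in plain Python: the emotion scores are transcribed from the listings on this page, and the helper name `dominant_emotion` is illustrative, not part of any Rekognition API.

```python
# Per-face emotion confidences transcribed from the AWS Rekognition
# entries above (one dict per detected face, values in percent).
faces = [
    {"Sad": 96.8, "Calm": 2.9, "Happy": 0.1, "Fear": 0.1, "Angry": 0.1,
     "Surprised": 0.0, "Confused": 0.0, "Disgusted": 0.0},
    {"Calm": 79.6, "Confused": 7.8, "Surprised": 5.4, "Happy": 3.9,
     "Angry": 1.2, "Disgusted": 0.9, "Sad": 0.7, "Fear": 0.5},
    {"Calm": 56.7, "Sad": 41.0, "Confused": 0.7, "Fear": 0.5, "Happy": 0.4,
     "Angry": 0.4, "Surprised": 0.2, "Disgusted": 0.0},
    {"Sad": 97.5, "Calm": 1.0, "Fear": 0.7, "Angry": 0.3, "Happy": 0.1,
     "Confused": 0.1, "Disgusted": 0.1, "Surprised": 0.1},
    {"Sad": 99.9, "Calm": 0.1, "Fear": 0.0, "Happy": 0.0, "Confused": 0.0,
     "Angry": 0.0, "Surprised": 0.0, "Disgusted": 0.0},
]

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

dominant = [dominant_emotion(f) for f in faces]
# Three of the five detected faces read as predominantly "Sad",
# the other two as "Calm".
```

This only restates the data already shown; the underlying scores come from Rekognition's face detection, which reports a confidence per emotion label for each detected face.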

Feature analysis

Amazon

Person 99.3%
Painting 66.8%
