Human Generated Data

Title

Ruby, Daughter of a Coal Miner from Tennessee, American River Camp, near Sacramento, California, November 1936

Date

1936

People

Artist: Dorothea Lange, American, 1895-1965

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Sedgewick Memorial Collection, 2.2002.711

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Housing 98.5
Building 98.5
Furniture 97
Sitting 91
Bed 84.1
Indoors 81.5
Face 74.5
Interior Design 70.1
Photography 66
Photo 66
Portrait 66
Clothing 62.7
Apparel 62.7
Bedroom 57.4
Room 57.4
Attic 55.3
Loft 55.3

Clarifai
created on 2023-10-27

people 100
adult 99.5
man 98
two 97.7
one 97.6
portrait 97.5
street 97.2
woman 96.7
war 95.3
child 94.6
administration 91.1
monochrome 90.7
wear 89.2
group together 89
vehicle 87.8
family 87.6
soldier 86.9
three 86.8
boy 85.6
furniture 83.2

Imagga
created on 2022-01-22

passenger 48.8
man 19.5
people 17.8
person 17.6
male 17
adult 15.5
old 15.3
barbershop 14.7
building 13.9
car 12.8
window 12.6
shop 12.5
door 12.5
black 12
automobile 11.5
vehicle 11.2
work 11
iron 10.9
dirty 10.8
transportation 10.7
happy 10.6
one 10.4
portrait 10.3
architecture 10.1
inside 10.1
industrial 10
smile 10
vintage 9.9
wall 9.4
lifestyle 9.4
mercantile establishment 9.2
house 9.2
human 9
worker 9
stretcher 8.8
home 8.8
urban 8.7
auto 8.6
construction 8.5
business 8.5
future 8.4
street 8.3
back 8.3
alone 8.2
cell 8.1
light 8
steel 7.9
driver 7.8
industry 7.7
grunge 7.7
casual 7.6
city 7.5
safety 7.4
chair 7.3
transport 7.3
sensuality 7.3
looking 7.2
history 7.2
face 7.1
working 7.1
travel 7
indoors 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 95.9
clothing 92.3
person 92.3
drawing 84.1
black and white 77.7
man 72.1
sketch 62.6
old 52.9

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 25-35
Gender Female, 100%
Fear 98.7%
Surprised 0.7%
Calm 0.2%
Sad 0.2%
Happy 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Bed 84.1%

Categories

Imagga

paintings art 52.5%
interior objects 44.6%
food drinks 1.4%

Captions