Human Generated Data

Title

Untitled (woman having her hair done in mobile home park)

Date

1951, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.188

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Chair 99.6
Furniture 99.6
Person 99
Human 99
Person 98.4
Person 85.5
Shorts 78.4
Clothing 78.4
Apparel 78.4
Person 72.9
Person 66.9
Sitting 66.2
People 64.7
Shoe 64.7
Footwear 64.7
Ground 58.1
Advertisement 57.4
Spoke 55.3
Machine 55.3
Poster 55.3
Face 55.1

Clarifai
created on 2023-10-25

people 100
two 99.6
man 99.3
three 98.3
group together 98.2
group 97.6
adult 97.1
chair 96.8
woman 96
recreation 95
seat 93.4
furniture 93.2
elderly 93.2
vehicle 92.9
child 92.6
family 92.3
retro 91.9
transportation system 91.6
four 91.5
sit 90.8

Imagga
created on 2022-01-08

tricycle 67.8
wheeled vehicle 60.8
vehicle 48.5
wheelchair 45.7
chair 32.4
conveyance 28.5
man 26.2
people 24.5
seat 24.2
outdoors 22.5
male 21.4
adult 18.1
person 17.4
disabled 15.8
care 15.6
old 15.3
wheel 15.1
sport 14.3
help 14
outdoor 13.8
lifestyle 13.7
transportation 13.4
black 13.2
outside 12.8
active 12.6
senior 12.2
men 12
health 11.8
summer 11.6
love 11
park 10.7
furniture 10.5
illness 10.5
day 10.2
happy 10
pedestrian 10
disability 9.9
activity 9.8
mobility 9.8
retired 9.7
sick 9.7
beach 9.4
city 9.1
leisure 9.1
human 9
cart 8.9
family 8.9
handicapped 8.9
handicap 8.9
medical 8.8
women 8.7
smiling 8.7
snow 8.6
elderly 8.6
sitting 8.6
husband 8.6
wife 8.5
walking 8.5
parent 8.4
support 8.4
sky 8.3
street 8.3
mother 8.2
invalid 7.9
urban 7.9
couple 7.8
portrait 7.8
ride 7.8
travel 7.7
retirement 7.7
walk 7.6
one 7.5
transport 7.3
jinrikisha 7.3
road 7.2
armchair 7.1
bicycle 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.4
person 96.8
clothing 95.4
outdoor 90.1
black and white 84.4
man 81.4
vehicle 81.1
wheel 80
land vehicle 75.2
tire 57.3
posing 35.3

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 92.7%
Happy 99.6%
Calm 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0%
Sad 0%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Shoe 64.7%

Categories

Imagga

paintings art 99.9%