Human Generated Data

Title

Untitled (woman styling seated woman's hair)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7579

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (each label is followed by a machine confidence score, 0–100)

Amazon
created on 2022-01-08

Furniture 100
Chair 100
Person 97.3
Human 97.3
Person 96
Clothing 90.2
Apparel 90.2
Worker 90
Face 83.9
Shoe 76.9
Footwear 76.9
Hairdresser 75.6
Person 74.1
Person 73.4
Meal 72.6
Food 72.6
Photography 70.7
Photo 70.7
Female 70.5
Portrait 70.2
Yard 65.9
Nature 65.9
Outdoors 65.9
Building 62.9
Housing 62.9
Plant 61.7
Sitting 57.2
Villa 55.9
House 55.9
People 55.8
Shoe 53.1
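
The Amazon labels above are the kind of output a label-detection service such as Amazon Rekognition returns: a label name paired with a confidence score. The sketch below shows how comparable name/score pairs could be retrieved with boto3; it is illustrative only, and the file name, label limit, and confidence threshold are placeholder assumptions, not the settings used to produce this record.

# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes configured AWS credentials; "photo.jpg" is a placeholder, not the
# actual source image of this object.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # illustrative limit
    MinConfidence=50.0,  # illustrative threshold
)

# Print label name and confidence, mirroring the "label score" layout above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')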

Clarifai
created on 2023-10-25

people 99.9
adult 99.1
group together 99.1
monochrome 98.4
furniture 97
group 96.5
two 96
child 95.5
recreation 95.1
woman 94.4
three 94.3
man 93.5
seat 93.5
several 91.7
war 91.5
administration 91.2
wear 90.3
actor 89.8
sitting 89.2
chair 89.1
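
The Clarifai concepts follow the same label-plus-confidence pattern. The sketch below shows one plausible way to request such concepts from Clarifai's v2 REST API with the requests library; the endpoint path, model identifier, payload shape, and response fields are assumptions recalled from that API and should be checked against current Clarifai documentation, and the key and image URL are placeholders.

# Hypothetical sketch of a Clarifai v2 model request; endpoint, model id,
# and payload/response shape are assumptions, not confirmed by this record.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public model id
URL = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
resp = requests.post(URL, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts carry a name and a 0-1 value; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')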

Imagga
created on 2022-01-08

tricycle 36.8
wheeled vehicle 30.4
vehicle 23.1
conveyance 19.4
comic book 17.4
person 17
people 16.2
newspaper 15.2
man 12.8
product 12.8
sky 12.7
umbrella 12.7
outdoor 11.5
building 11.3
sport 11.1
adult 10.3
active 9.9
snow 9.8
sexy 9.6
black 9.6
body 9.6
men 9.4
creation 9.2
city 9.1
art 9.1
old 9
fashion 9
lifestyle 8.7
costume 8.6
winter 8.5
tool 8.3
human 8.2
water 8
celebration 8
canopy 7.9
device 7.9
architecture 7.8
portrait 7.8
electric chair 7.7
war 7.7
modern 7.7
outside 7.7
statue 7.6
decoration 7.5
design 7.3
exercise 7.3
activity 7.2
holiday 7.2
history 7.1
work 7.1
women 7.1
male 7.1
travel 7
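
Imagga exposes its tagger as a plain REST endpoint. The sketch below shows one way tags like those above might be fetched with requests and HTTP basic auth; the endpoint, query parameter, and response fields are assumptions based on Imagga's v2 API, and the credentials and image URL are placeholders.

# Hypothetical sketch of an Imagga v2 tagging request; verify the endpoint and
# response structure against the Imagga documentation before relying on it.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')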

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.7
outdoor 97.4
person 86.2
black and white 80.2
posing 63.2
statue 51
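
The Microsoft tags resemble the output of Azure Computer Vision image analysis. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, SDK details can vary by version, and nothing here is documented as the exact setup behind this record.

# Minimal sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

analysis = client.analyze_image(
    "https://example.com/photo.jpg",
    visual_features=[VisualFeatureTypes.tags],
)

# Tag confidences come back as 0-1; scale to match the 0-100 scores above.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")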

Face analysis

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
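
The likelihood labels above (Very unlikely through Very likely) are how the Google Cloud Vision face-detection API reports emotion and attribute estimates. A minimal sketch with the google-cloud-vision client library follows, assuming application-default credentials and a placeholder local file; enum access can differ slightly between library versions.

# Minimal sketch: face attribute likelihoods with Google Cloud Vision.
# "photo.jpg" is a placeholder, not the actual source image of this object.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums map to the labels shown above (e.g. VERY_UNLIKELY).
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)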

Feature analysis

Amazon

Person 97.3%
Shoe 76.9%
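
The feature percentages above correspond to labels for which Amazon Rekognition also reports individual detected instances with bounding boxes (for example, each person and shoe). A hedged sketch of reading those instances from a detect_labels response follows; the file name is again a placeholder.

# Sketch: per-instance detections (bounding boxes) with Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # Left/Top/Width/Height as ratios of image size
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f})')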

Text analysis

Amazon

31134
NAGOY
ECC
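
The fragments above ("31134", "NAGOY", "ECC") are the kind of strings an OCR pass such as Amazon Rekognition text detection produces for signage or markings in a photograph. A minimal sketch with boto3, again using a placeholder file name:

# Minimal sketch: text detection (OCR) with Amazon Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is a LINE or WORD with the recognized string and a confidence.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')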