Human Generated Data

Title

Untitled (women seated on hood of truck)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7098

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Shorts 99.6
Clothing 99.6
Apparel 99.6
Person 99.6
Person 99.3
Person 98.4
Car 98.2
Automobile 98.2
Vehicle 98.2
Transportation 98.2
Car 97.4
Person 89
Shoe 82.9
Footwear 82.9
People 72.7
Leisure Activities 63.6
Wheel 63.1
Machine 63.1
Face 60.9
Text 59.4
Drawing 57.1
Art 57.1
Kid 57
Child 57
Girl 56.8
Female 56.8
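
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. A minimal sketch in Python, assuming boto3, configured AWS credentials, and a hypothetical S3 bucket and object key:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        # Hypothetical S3 location; this image's actual storage is not given here.
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7098.jpg"}},
        MinConfidence=50,
    )
    for label in response["Labels"]:
        # Prints e.g. "Person 99.7", matching the list above.
        print(label["Name"], round(label["Confidence"], 1))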

Clarifai
created on 2023-10-15

people 99
man 96.1
woman 95.3
adult 94.7
fun 87.6
child 86.2
monochrome 85.7
group 84.4
group together 84.1
young 81.1
girl 80.2
summer 79.8
desktop 79.5
business 79.1
many 78.9
squad 78.2
boy 77.2
communication 73.7
leisure 73.3
couple 72.3
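
Clarifai scores concepts on a 0-1 scale (shown above as percentages). A hedged sketch against Clarifai's v2 REST predict endpoint using the requests library; the model ID, API key, and image URL are placeholders, not values taken from this record:

    import requests

    response = requests.post(
        # Model ID is an assumption; Clarifai's general model is a common choice.
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},  # placeholder credential
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Scale 0-1 scores to percentages to match the list above.
        print(concept["name"], round(concept["value"] * 100, 1))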

Imagga
created on 2021-12-15

automaton 24.1
iron lung 21.8
respirator 17.4
man 14.8
art 14.4
3d 13.9
breathing device 13.1
design 12.9
pattern 10.3
device 10.2
human 9.7
people 9.5
black 9
body 8.8
shape 8.8
graphic 8.7
business 8.5
decoration 8.4
silhouette 8.3
drawing 8.1
cartoon 8
person 8
cemetery 7.8
male 7.8
motion 7.7
health 7.6
dollar 7.4
sport 7.4
symbol 7.4
speed 7.3
backgrounds 7.3
currency 7.2
team 7.2
transparent 7.2
wreckage 7.1
science 7.1
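
Imagga's tags correspond to its /v2/tags endpoint, which scores tags for an image URL and returns confidences already expressed as percentages. A sketch assuming the requests library, with placeholder basic-auth credentials and image URL:

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
    )
    for tag in response.json()["result"]["tags"]:
        # English tag text plus its confidence, e.g. "automaton 24.1".
        print(tag["tag"]["en"], round(tag["confidence"], 1))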

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 90.1
clothing 88.3
outdoor 88.2
person 73.3
black and white 72.5
footwear 62.3
posing 41.6
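
Microsoft's tags resemble the output of Azure Computer Vision's image-tagging operation. A sketch against the v3.2 REST endpoint using the requests library; the resource endpoint, subscription key, and image URL are placeholders:

    import requests

    endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    response = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
        json={"url": "https://example.org/photo.jpg"},  # placeholder image URL
    )
    for tag in response.json()["tags"]:
        # Azure reports 0-1 confidences; scale to percentages as shown above.
        print(tag["name"], round(tag["confidence"] * 100, 1))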

Face analysis

AWS Rekognition

Age 28-44
Gender Female, 80%
Calm 88.9%
Happy 5.5%
Sad 2.2%
Surprised 1.7%
Fear 0.7%
Disgusted 0.4%
Confused 0.3%
Angry 0.2%

AWS Rekognition

Age 49-67
Gender Female, 73.7%
Calm 36%
Sad 27.5%
Fear 14%
Angry 10.3%
Happy 6.2%
Disgusted 2.9%
Surprised 2%
Confused 1.1%
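
The age ranges, gender confidences, and ranked emotion scores above follow the shape of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch, again with a hypothetical S3 location:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7098.jpg"}},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )
    for face in response["FaceDetails"]:
        print("Age", face["AgeRange"]["Low"], "-", face["AgeRange"]["High"])
        print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
        # Emotions arrive unsorted; order by confidence to match the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(emotion["Type"].capitalize(), round(emotion["Confidence"], 1))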

Google Vision

Three faces detected; each returned identical likelihoods:

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
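
Google Vision reports per-face attributes as likelihood enums (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision client library, assuming a placeholder Cloud Storage URI:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    # Placeholder URI; the image's actual storage location is not given here.
    image = vision.Image(source=vision.ImageSource(image_uri="gs://example-bucket/steinmetz-7098.jpg"))
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)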

Feature analysis

Amazon

Person 99.7%
Car 98.2%
Shoe 82.9%

Categories

Imagga

paintings art 99.7%

Text analysis

Amazon

16096
6
a
94
B77

Google

16096. B7 16096. NAGON-YT3RA2-NAMTZA3
16096.
B7
NAGON-YT3RA2-NAMTZA3
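
The Google result lists the full detected text first, followed by its individual lines; the reversed-looking string suggests the OCR engine read mirrored lettering, as on a film edge. Line-level detections like Amazon's come from Rekognition's DetectText API; a minimal boto3 sketch with a hypothetical S3 location:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7098.jpg"}}
    )
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # WORD-level detections are also returned
            print(detection["DetectedText"], round(detection["Confidence"], 1))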