Human Generated Data

Title

Untitled (two women driving in a convertible)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7400

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Car 98.1
Vehicle 98.1
Automobile 98.1
Transportation 98.1
Person 96.3
Human 96.3
Person 89.3
Mirror 78.6
Road 67.3
Car Mirror 66.1
Photography 62.3
Photo 62.3
Plant 61.6
Bridegroom 58.6
Wedding 58.6
Windshield 57.9
Text 56.9
Collage 56.3
Advertisement 56.3
Poster 56.3
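
The label/confidence pairs above have the shape of Amazon Rekognition's DetectLabels output. A minimal boto3 sketch of how such tags could be regenerated, assuming a local copy of the photograph; the region and filename are placeholders:

```python
import boto3

# Region is an assumption; any region that offers Rekognition works.
client = boto3.client("rekognition", region_name="us-east-1")

# Placeholder filename for a local copy of the photograph.
with open("steinmetz_4.2002.7400.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the list above has 20 tags
    MinConfidence=50.0,  # the lowest tag above is ~56
)

# Emit "Label confidence" pairs in the same format as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```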

Clarifai
created on 2023-10-25

people 99.7
adult 98.8
man 98.4
two 97.9
woman 96.2
desktop 94.7
monochrome 94.2
car 92.4
portrait 91.9
vehicle 91.3
couple 91.1
transportation system 90.1
one 89.4
group 86.9
vehicle window 86.8
street 85.4
sit 85.1
person 81.4
three 80.3
black and white 80
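
Clarifai's concept scores can be reproduced through its v2 predict REST endpoint. A hedged sketch, assuming a Personal Access Token and the general recognition model; the token, model ID, and image URL are placeholders, and the exact model path can vary by account setup:

```python
import requests

CLARIFAI_PAT = "YOUR_PAT"               # placeholder access token
MODEL_ID = "general-image-recognition"  # assumed ID of the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry 0-1 scores; scale by 100 to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```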

Imagga
created on 2022-01-08

car mirror 50.2
mirror 39.6
windshield wiper 38.4
mechanical device 30.7
reflector 30.1
adult 25.9
television 25.7
car 25.2
person 24
mechanism 22.9
sitting 22.3
laptop 20.5
people 20.1
happy 19.4
computer 18.1
office 17.7
driver 17.5
portrait 17.5
device 17.1
business 17
pretty 16.1
vehicle 15.9
work 15.7
automobile 15.3
screen 15.1
man 14.8
smile 14.2
job 14.2
working 14.1
attractive 14
smiling 13.7
businesswoman 13.6
transportation 13.4
desk 13.2
casual 12.7
professional 12.7
women 12.7
drive 12.3
lifestyle 12.3
technology 11.9
windshield 11.8
face 11.4
lady 11.4
telecommunication system 11.4
outdoors 11.2
hair 11.1
outdoor 10.7
male 10.6
one 10.4
looking 10.4
corporate 10.3
windowsill 10
driving 9.7
auto 9.6
wheel 9.4
model 9.3
cute 9.3
executive 9.2
modern 9.1
student 9.1
worker 8.9
support 8.8
brunette 8.7
career 8.5
black 8.4
horizontal 8.4
notebook 8.1
road 8.1
cheerful 8.1
protective covering 8
sill 8
businessman 7.9
happiness 7.8
communication 7.6
contemporary 7.5
joy 7.5
human 7.5
phone 7.4
20s 7.3
broadcasting 7.3
success 7.2
color 7.2
squeegee 7.2
day 7.1
sunglasses 7
travel 7
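
Imagga's tags come from its /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP basic auth. A minimal sketch; key, secret, and image URL are placeholders:

```python
import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz.jpg"},  # stand-in URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each record looks like {"confidence": 50.2, "tag": {"en": "car mirror"}}.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```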

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.4
human face 87.9
black and white 85.3
person 71
tree 53.5
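
The Microsoft tags above, and the caption under "Captions" below, match the Tags and Description features of Azure Computer Vision's Analyze Image endpoint (v3.2). A sketch assuming a placeholder resource endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_KEY"                                                # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz.jpg"},  # stand-in URL
)
resp.raise_for_status()
body = resp.json()

# Confidences are 0-1; scale by 100 to match the figures above.
for tag in body["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in body["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```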

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Female, 52.7%
Happy 41.3%
Disgusted 20.9%
Calm 15.8%
Angry 13%
Confused 3.6%
Sad 2.4%
Surprised 2.3%
Fear 0.8%

AWS Rekognition

Age 16-24
Gender Female, 53.2%
Sad 91.9%
Angry 2.4%
Calm 2.1%
Fear 1.7%
Confused 0.8%
Surprised 0.5%
Disgusted 0.4%
Happy 0.2%
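
The two blocks above (age range, gender, per-emotion confidences) have the shape of Rekognition's DetectFaces output with all attributes requested. A minimal boto3 sketch; region and filename are placeholders:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("steinmetz_4.2002.7400.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

# Attributes=["ALL"] is needed to get age range, gender, and emotions.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotion types arrive uppercase (e.g. "HAPPY"), one confidence each.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```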

Feature analysis

Amazon

Car 98.1%
Person 96.3%

Captions

Microsoft
created on 2022-01-08

graphical user interface, website 92.1%

Text analysis

Amazon

15936.
NAGOY
NAGOY MAMTAAT
MAMTAAT
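
These strings are OCR hits with the shape of Rekognition's DetectText output, which returns whole LINE detections alongside their component WORDs (the repeated fragments above are consistent with that). A minimal boto3 sketch; region and filename are placeholders:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("steinmetz_4.2002.7400.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Each detection is typed LINE or WORD; LINEs reproduce strings like "NAGOY MAMTAAT".
for det in response["TextDetections"]:
    print(f'{det["Type"]}: {det["DetectedText"]} ({det["Confidence"]:.1f}%)')
```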

Google

15936- 13936.
15936-
13936.