Human Generated Data

Title

Untitled (woman seated in car looking at blueprints held by a man)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4683

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Clothing 99
Apparel 99
Person 98.5
Human 98.5
Face 98.2
Person 97.8
Water 96.5
Furniture 93.6
Chair 93.6
Outdoors 82.7
Photo 80.2
Photography 80.2
Portrait 80.2
Waterfront 79.5
Female 79.4
Transportation 78.8
Watercraft 78.8
Vessel 78.8
Vehicle 78.8
Plant 78.7
Nature 77.9
Pants 74.8
Boat 68.9
Accessory 66.7
Glasses 66.7
Accessories 66.7
Port 66.6
Dock 66.6
Pier 66.6
Head 63.6
Text 61.4
Blossom 61
Flower 61
Wedding 60.2
Bridegroom 60.2
Vase 60.1
Potted Plant 60.1
Pottery 60.1
Jar 60.1
Poster 59.8
Advertisement 59.8
Man 59.7
Collage 59.2
Woman 58.3
Fashion 58
Robe 58
Table 58
Building 57.8
Porch 57.3
Barge 55.3

Imagga
created on 2022-02-05

monitor 30.9
technology 23.7
man 21.5
equipment 21.1
computer 21
business 20.6
work 20.4
people 19.5
electronic equipment 19.2
male 19.1
television 18.4
newspaper 17.4
person 17
businessman 15.9
billboard 15.3
digital 14.6
cockpit 14.5
job 14.1
hand 13.7
working 13.2
product 12.7
worker 12.4
signboard 11.8
3d 11.6
education 11.3
office 11.1
effects 10.4
adult 10.4
design 10.1
engineer 10.1
creation 9.9
screen 9.8
human 9.7
chart 9.6
professional 9.5
three dimensional 9.3
finance 9.3
board 9.2
film 9.2
occupation 9.2
data 9.1
student 9.1
success 8.8
render 8.6
men 8.6
imagination 8.5
modern 8.4
building 8.3
room 8.3
graphics 8.2
laptop 8.2
industrial 8.2
structure 8.2
blackboard 8.1
negative 8.1
symbol 8.1
science 8
information 8
classroom 8
display 7.9
teacher 7.8
hands 7.8
stage 7.8
construction 7.7
happy 7.5
manager 7.4
businesswoman 7.3
black 7.2
looking 7.2
telecommunication system 7.2
bright 7.1
financial 7.1
portrait 7.1
medical 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8
black and white 94.9
person 92
water 57.9
sky 52.2

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Male, 73.8%
Happy 58.9%
Calm 35.7%
Surprised 1.9%
Sad 1.2%
Angry 0.7%
Fear 0.6%
Disgusted 0.6%
Confused 0.3%

Feature analysis

Amazon

Person 98.5%
Boat 68.9%

Captions

Microsoft

a person standing in front of a box 49.3%

Text analysis

Amazon

22
39461-A.

Google

39461-A.
39461-A. YT37A°2-XA
YT37A°2-XA