Human Generated Data

Title

Untitled (people eating outside of Burger Queen)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7175

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 95.5
Human 95.5
Building 91.3
Person 89.7
Person 85.5
Nature 81
Architecture 76.6
Chair 75.9
Furniture 75.9
Outdoors 75.7
City 75.1
Town 75.1
Urban 75.1
Person 71
Downtown 65.9
Person 62.3
Restaurant 60.6
Hotel 59.7
Person 57
Cafeteria 55.5
Person 52.3

Imagga
created on 2022-01-08

architecture 32.8
building 31.6
sky 23.6
house 23.4
resort 23.4
city 19.1
structure 18.4
modern 18.2
landscape 17.9
home 17.1
travel 16.9
construction 16.3
beach 15.1
water 14
urban 14
vacation 13.9
scene 13.9
exterior 13.8
window 13.7
tropical 13.6
holiday 13.6
sea 13.5
trees 12.5
ocean 12.4
hall 12.4
snow 12
office 11.9
design 11.8
summer 11.6
business 11.5
outdoor 11.5
bay 11.3
outdoors 11.2
road 10.8
sand 10.6
palm 10.3
industry 10.3
residence 10
tree 10
sun 9.7
grass 9.5
paradise 9.4
day 9.4
winter 9.4
ski slope 9.3
town 9.3
tourism 9.1
scenic 8.8
apartment 8.6
expensive 8.6
luxury 8.6
estate 8.6
facility 8.5
bridge 8.5
buildings 8.5
balcony 8.5
clouds 8.5
island 8.2
new 8.1
river 8
night 8
interior 8
door 7.8
cloud 7.8
property 7.7
garage 7.7
station 7.7
relax 7.6
cityscape 7.6
relaxation 7.5
floor 7.4
slope 7.4
negative 7.4
street 7.4
light 7.4
transport 7.3
supermarket 7.3
coast 7.2
glass 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 95.3
black and white 78.8
sky 78.3
fog 73.5
building 51.2
shore 15.2

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 70.5%
Calm 58.9%
Sad 13.8%
Confused 10.8%
Happy 8%
Angry 3.1%
Fear 1.9%
Disgusted 1.8%
Surprised 1.8%

Feature analysis

Amazon

Person 95.5%

Captions

Microsoft

a person standing in front of a window 39.6%
an old photo of a person 39.5%
a person standing in front of a window 39.4%

Text analysis

Amazon

OPEN
SU
2
OZ