Human Generated Data

Title

Untitled (man seated on window ledge, snowy street outside)

Date

c. 1950, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.182

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Car 98.5
Automobile 98.5
Transportation 98.5
Vehicle 98.5
Human 96
Person 96
Car 95.1
Outdoors 93
Nature 93
Snow 82.3
Ice 81.8
Apparel 63.6
Clothing 63.6
Overcoat 56.7
Coat 56.7
Winter 55.6
Door 55.4
Person 49.6

Imagga
created on 2021-12-14

building 29.7
architecture 26.8
house 25.1
winter 24.7
travel 24.6
snow 23.7
structure 23.5
gas pump 22.8
pump 21
sky 19.8
city 19.1
wheeled vehicle 18
home 17.6
shop 17.4
barbershop 17.3
street 16.6
office 16.4
old 15.3
tourism 14.9
car 14.2
mechanical device 14.1
vehicle 13.6
urban 13.1
window 13
tourist 12.8
transportation 12.6
wood 12.5
ocean 12.4
mercantile establishment 12.4
roof 12.4
conveyance 12.4
mobile home 12.3
town 12.1
construction 12
road 11.7
cold 11.2
light 10.7
housing 10.7
water 10.7
sea 10.2
trailer 10
tree 10
history 9.8
station 9.8
boat 9.7
wall 9.6
residential 9.6
mechanism 9.5
brick 9.4
tramway 9.4
destination 9.4
outdoor 9.2
transport 9.1
modern 9.1
landscape 8.9
new 8.9
trees 8.9
ancient 8.6
weather 8.6
empty 8.6
glass 8.6
buildings 8.5
place of business 8.4
exterior 8.3
outdoors 8.2
vacation 8.2
night 8
wooden 7.9
restaurant 7.9
day 7.8
season 7.8
door 7.8
luxury 7.7
industry 7.7
room 7.6
traditional 7.5
passenger car 7.5
historic 7.3
resort 7.2
rural 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

street 95.4
text 93.7
black and white 88.7
vehicle 82
car 75.4
land vehicle 74.4
city 74
house 71.1
monochrome 60.4
old 42.6

Face analysis

AWS Rekognition

Age 46-64
Gender Male, 76.9%
Calm 83.7%
Surprised 12.8%
Confused 2.6%
Angry 0.3%
Sad 0.2%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Car 98.5%
Person 96%

Captions

Microsoft

a vintage photo of a person 84.8%
a vintage photo of a person 81.2%
a vintage photo of a person in a room 81.1%