Human Generated Data

Title

Untitled (overview of fair)

Date

1949

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2032

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Theme Park 98.9
Amusement Park 98.9
Person 98.4
Person 97.2
Tent 95.5
Person 94.3
Tent 88.7
Person 87.2
Person 86
Tent 84.9
Person 80
Person 77.8
Crowd 74.3
Person 73.8
Person 71
Person 62.2
Carousel 59.7
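
Each machine-generated tag above pairs a label with a confidence score on a 0–100 scale. As an illustration only (not the museum's own pipeline), a minimal Python sketch of filtering such a list by a confidence threshold, with a few scores transcribed from the Amazon list above:

```python
# Label/confidence pairs transcribed from the Amazon tag list above.
tags = [
    ("Person", 99.5), ("Human", 99.5), ("Theme Park", 98.9),
    ("Amusement Park", 98.9), ("Tent", 95.5), ("Crowd", 74.3),
    ("Carousel", 59.7),
]

def filter_tags(tags, min_confidence=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= min_confidence]

# With the default 90.0 threshold, only the five highest-scoring labels remain.
print(filter_tags(tags))
# -> ['Person', 'Human', 'Theme Park', 'Amusement Park', 'Tent']
```

Lowering the threshold (e.g. `filter_tags(tags, 50.0)`) admits weaker guesses such as "Carousel", which is how low-confidence tags like those in the Imagga list below would surface.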

Imagga
created on 2021-12-14

building 35.5
architecture 31.3
carousel 22.4
sky 21
travel 20.4
hall 19.1
ride 18.5
structure 18.4
city 18.3
canvas tent 18.1
umbrella 17.7
house 16.9
sea 16.6
water 16
ocean 15.9
restaurant 15.6
modern 15.4
resort 15.4
night 15.1
tourism 14.8
beach 14
construction 13.7
mechanical device 13.5
chair 13.5
table 13.2
summer 12.9
interior 11.5
window 11.5
vacation 11.4
sun 11.3
landscape 11.1
luxury 11.1
bay 11
business 10.9
tourist 10.9
coast 10.8
bridge 10.5
roof 10.3
mechanism 10.3
island 10.1
chairs 9.8
reflection 9.7
hotel 9.5
barroom 9.4
winter 9.4
destination 9.3
relax 9.3
relaxation 9.2
leisure 9.1
stall 9.1
center 8.9
furniture 8.9
home 8.9
urban 8.7
glass 8.7
parasol 8.7
office 8.7
cloud 8.6
room 8.5
palm 8.4
shore 8.4
boat 8.3
lights 8.3
light 8
cafeteria 7.9
holiday 7.9
scene 7.8
tree 7.7
tropical 7.7
outdoor 7.6
door 7.6
trip 7.5
vacations 7.5
wall 7.5
evening 7.5
street 7.4
design 7.3
lifestyle 7.2
scenic 7
patio 7

Google
created on 2021-12-14

Photograph 94.2
Black 89.7
Black-and-white 86.1
Style 84.1
Tent 82.9
Line 82
Monochrome photography 75.8
Monochrome 75.7
Beauty 75
Shade 74.4
Snapshot 74.3
Chair 73.5
Art 73.2
Crowd 72.2
Building 71.6
Event 70.1
Fun 66.6
Leisure 65.8
Stock photography 64.9
Room 64.2

Microsoft
created on 2021-12-14

text 77.7
sky 70.7
playground 69.6
white 65.2
tent 55.8

Face analysis

AWS Rekognition

Age 11-21
Gender Male, 59.8%
Sad 46.5%
Fear 19.7%
Calm 12.5%
Surprised 10.8%
Confused 7%
Angry 2.3%
Disgusted 0.6%
Happy 0.5%
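
The Rekognition face record above reports a confidence for each candidate emotion, summing to roughly 100%. A small sketch (an illustration, not Rekognition's API) of reducing such a distribution to its dominant emotion, using the percentages transcribed from the block above:

```python
# Emotion confidences (percent) transcribed from the AWS Rekognition
# face analysis above.
emotions = {
    "Sad": 46.5, "Fear": 19.7, "Calm": 12.5, "Surprised": 10.8,
    "Confused": 7.0, "Angry": 2.3, "Disgusted": 0.6, "Happy": 0.5,
}

def dominant_emotion(emotions):
    """Return the emotion label with the highest confidence."""
    return max(emotions, key=emotions.get)

print(dominant_emotion(emotions))  # -> Sad
```

Note that the top score here is under 50%, so "Sad" is only a plurality guess, not a confident classification.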

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Tent 95.5%

Captions

Microsoft

a vintage photo of a group of people standing in front of a building 77.6%
a vintage photo of a group of people in front of a building 77.5%
a vintage photo of a group of people in a tent 77.4%

Text analysis

Amazon

YT33A2
MU YT33A2 ACCHA
MU
ACCHA

Google

YT33A2_032MA
M 13 YT33A2_032MA
M
13