Human Generated Data

Title

Fiedler at the Pops

Date

1947

People

Artist: Harold Edgerton, American, 1903–1990

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of The Harold and Esther Edgerton Family Foundation, P1996.86

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Hall 99.3
Auditorium 99.3
Room 99.3
Indoors 99.3
Theater 99.3
Human 99.2
Person 99.2
Interior Design 96.9
Crowd 96.8
Audience 93.7
Lighting 84.2
Concert 70.8
Stage 59
Dress 58.1
Apparel 58.1
Clothing 58.1
Person 55.8
Coat 55.8
Overcoat 55.8
Person 43.8

Imagga
created on 2022-02-26

building 43
cinema 34
architecture 33.6
theater curtain 31.9
theater 30.8
structure 30.7
curtain 28.2
city 23.3
blind 21.1
old 20.9
window 20.5
blackboard 20.4
urban 15.7
art 15.7
landmark 15.3
wall 15.3
protective covering 15.1
church 13.9
billboard 13.3
famous 13
hall 13
light 12.7
religion 12.5
night 12.4
bridge 12.4
ancient 12.1
travel 12
vintage 11.6
tourism 11.6
interior 11.5
antique 11.3
modern 11.2
design 10.7
signboard 10.4
cathedral 10.1
historic 10.1
house 10
covering 9.7
buildings 9.5
historical 9.4
decoration 9.4
grunge 9.4
lights 9.3
tower 9
retro 9
construction 8.6
dark 8.4
texture 8.3
symbol 8.1
history 8.1
film 8
business 7.9
glass 7.8
culture 7.7
door 7.6
capital 7.6
cityscape 7.6
religious 7.5
frame 7.5
traditional 7.5
monument 7.5
exterior 7.4
icon 7.1
river 7.1
mosque 7
sky 7

Google
created on 2022-02-26

Rectangle 87.8
Art 80.4
Font 77.9
Tints and shades 77.3
Urban design 74.1
Event 69.8
Symmetry 69.7
Metal 68.6
Visual arts 67.1
Presentation 64.6
Darkness 61.5
Square 61.2
City 61
Room 59.3
Magenta 57.3
Ceiling 55.8
Display device 54.7
Arch 54.6
Projection screen 54.6
Illustration 53.2

Microsoft
created on 2022-02-26

text 97.1
television 96.6
monitor 91.5
screenshot 85.3
screen 81.4
person 80.6
flat 31.7
picture frame 11.2

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Fear 36.4%
Surprised 34.2%
Calm 9.8%
Confused 8.2%
Angry 3.8%
Sad 2.9%
Disgusted 2.8%
Happy 1.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a flat screen television 72.2%
a flat screen television in a dark room 61.3%
a flat screen tv sitting in front of a television 55.5%