Human Generated Data

Title

Untitled (elevated view of livestock ring with audience watching)

Date

1956

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6562

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Human 99.1
Person 99.1
Person 97.9
Nature 96.3
Outdoors 85.7
Person 74.2
People 66.3
Indoors 60
Landscape 59.7
Urban 59.5
Weather 57.6
Room 56.9
Person 42.5
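
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch, assuming boto3 is configured with AWS credentials and the photograph is available as a local JPEG (the file name is illustrative):

import boto3

# Rekognition client; region and credentials come from the standard AWS config.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Read the scanned photograph as raw bytes (hypothetical file name).
with open("4.2002.6562.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns labels such as "Human", "Person", "Nature",
# each with a 0-100 confidence score.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=40,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")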

Clarifai
created on 2019-03-26

people 99
group 95.7
no person 91.9
empty 90.7
group together 89.8
many 88.9
vehicle 88.9
urban 88.8
adult 88.8
light 88.5
room 88.5
city 88.2
architecture 88.2
art 87.1
monochrome 86.7
building 86.1
crowd 85.7
one 85.2
old 84.6
wall 84.6
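
The Clarifai concepts above would typically come from Clarifai's v2 predict endpoint against its general model; the API reports values on a 0-1 scale, shown here as percentages. A rough sketch using requests, with the API key and model identifier as placeholders:

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # placeholder for the general model ID

with open("4.2002.6562.jpg", "rb") as f:  # hypothetical file name
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Each concept has a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")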

Imagga
created on 2019-03-26

negative 32.4
film 28.1
old 27.9
vintage 22.3
grunge 22.1
photographic paper 20.3
architecture 20
wall 19.3
ancient 18.2
texture 18.1
travel 17.6
newspaper 15.9
stone 15.4
city 15
landscape 14.9
antique 14.7
structure 13.9
photographic equipment 13.5
art 13.4
tunnel 13.2
aged 12.7
building 12.6
park 12.5
creation 12.3
product 12.2
detail 12.1
tourism 11.5
rock 11.3
sky 10.8
history 10.7
color 10.6
daily 10.6
pattern 10.3
light 10.1
mountain 9.9
retro 9.8
river 9.8
urban 9.6
grungy 9.5
culture 9.4
scenery 9
black 9
brown 8.8
textured 8.8
decoration 8.5
map 8.5
monument 8.4
world 8.4
sculpture 8.3
dirty 8.1
marble 8.1
paper 8
mine 8
scenic 7.9
design 7.9
forest 7.8
construction 7.7
worn 7.6
weathered 7.6
snow 7.5
frame 7.5
style 7.4
exterior 7.4
landmark 7.2
surface 7.1
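
The Imagga tags follow the same pattern and could be produced with Imagga's /v2/tags endpoint, authenticated with an API key/secret pair (both placeholders below); the image is passed by URL for simplicity:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/4.2002.6562.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each entry carries an English tag and a 0-100 confidence score.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")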

Google
created on 2019-03-26

Microsoft
created on 2019-03-26

white 66.4
people 59.2
old 56.3
crowd 1.1
black and white 1.1
art 0.9
monochrome 0.4
music 0.3
grand 0.3
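
The Microsoft tags are the sort of output Azure's Computer Vision tagging endpoint returns. A hedged sketch against the v2.0 REST API, with the endpoint region and subscription key as placeholders:

import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"               # placeholder

with open("4.2002.6562.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/tag",
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Tags come back with a 0-1 confidence value.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")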

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.5%
Sad 49.6%
Disgusted 49.5%
Happy 49.5%
Calm 50.3%
Angry 49.6%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 15-25
Gender Male, 50.3%
Angry 49.5%
Happy 49.6%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 50.2%
Sad 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Sad 50.4%
Calm 49.5%
Confused 49.5%
Surprised 49.5%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.5%
Sad 49.6%
Confused 49.5%
Happy 49.9%
Angry 49.6%
Calm 49.7%
Disgusted 49.6%
Surprised 49.6%

AWS Rekognition

Age 11-18
Gender Female, 50.4%
Happy 49.6%
Confused 49.6%
Calm 49.7%
Sad 49.8%
Disgusted 49.5%
Angry 49.6%
Surprised 49.6%
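
The per-face age ranges, gender estimates, and emotion scores above match the FaceDetails structure returned by Amazon Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch, again assuming boto3 and a local copy of the image:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.6562.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")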

Feature analysis

Amazon

Person 99.1%

Categories

Text analysis

Amazon

BAR
HOR6ESHOE BAR
HOR6ESHOE
1000O
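
Readings such as "HOR6ESHOE" and "1000O" are raw OCR strings, misrecognitions included; Amazon Rekognition's DetectText operation returns them as TextDetections. A minimal sketch under the same boto3 assumptions:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.6562.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; the detected string is kept
# verbatim, so OCR errors such as "HOR6ESHOE" appear unchanged.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")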