Human Generated Data

Title

Untitled (hoop-rolling race on grass)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19338

Machine Generated Data

Tags (label followed by confidence score, %)

Amazon
created on 2022-03-05

Field 98.7
People 98.2
Human 98.2
Person 96.8
Person 95.6
Person 93.9
Team Sport 93.6
Sport 93.6
Team 93.6
Sports 93.6
Football 92.5
Building 92.4
Person 91
Person 89.8
Person 89
Person 88.3
Person 87.9
Person 87.3
Stadium 86.3
Arena 86.3
Person 77
Crowd 72.2
Plant 70.9
Person 68.2
Football Field 64.4
Grass 61.8
Person 60.8
Clothing 58.7
Apparel 58.7
Portrait 55.7
Photography 55.7
Face 55.7
Photo 55.7
American Football 55.6
Person 55.6
Person 53.1
Person 51.5
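
These Amazon tags match the shape of output from the AWS Rekognition label-detection API. As a hedged illustration only (the exact pipeline behind this record is not documented here), a minimal boto3 sketch that would produce a comparable list, assuming configured AWS credentials and a hypothetical local scan named hoop_rolling_race.jpg:

# Minimal sketch, not the record's actual pipeline. Assumes AWS
# credentials are configured; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("hoop_rolling_race.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the list above bottoms out near 51.5
)

for label in response["Labels"]:
    # The repeated "Person" entries above correspond to individual
    # detected instances, each carried in the label's "Instances"
    # list with its own bounding box and confidence.
    print(f"{label['Name']} {label['Confidence']:.1f}")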

Clarifai
created on 2023-10-22

people 99.8
many 98.6
group 98.2
group together 98.1
cavalry 98.1
mammal 95.8
adult 95.7
canine 95.6
war 95.5
wear 91.9
military 91.9
dog 91.9
man 90.3
administration 90.1
outfit 87.8
leader 85.7
cemetery 84.6
funeral 83.5
crowd 83.2
campsite 82.9

Imagga
created on 2022-03-05

negative 40.1
gravestone 33.5
picket fence 32.4
film 31
memorial 27.7
fence 25.5
structure 25.3
cemetery 24.4
photographic paper 23.2
landscape 22.3
stone 21.4
tree 20.1
old 19.5
barrier 19.4
sky 18.5
trees 16.9
photographic equipment 15.5
travel 15.5
grunge 15.3
vintage 14.9
history 14.3
obstruction 13
antique 13
grass 12.6
architecture 12.5
park 12.3
house 11.9
snow 11.8
texture 10.4
winter 10.2
field 10
building 10
ancient 9.5
outdoors 9.1
tourism 9.1
summer 9
brown 8.8
rural 8.8
light 8.7
forest 8.7
spring 8.6
construction 8.6
weathered 8.5
grungy 8.5
outdoor 8.4
frame 8.3
city 8.3
countryside 8.2
retro 8.2
landmark 8.1
scenery 8.1
religion 8.1
scenic 7.9
textured 7.9
day 7.8
paper 7.8
space 7.8
clouds 7.6
old fashioned 7.6
grain 7.4
lake 7.3
wall 7.1
season 7

Google
created on 2022-03-05

Plant 92.1
Black 89.6
Tree 87.4
Grass 85.2
Black-and-white 83.2
Font 80.8
Rectangle 78.5
Monochrome photography 72.7
Landscape 72.4
Lawn 72.3
Event 70.3
Monochrome 70.2
Photo caption 68.2
Art 66.6
History 65.5
Room 65.4
Stock photography 63.7
Pole 62.7
Plantation 62.5
Visual arts 58.3

Microsoft
created on 2022-03-05

outdoor 98.7
grass 98.3
black and white 91.4
tree 84.4
black 80.2
text 79.9
white 78.4
field 75.5
grave 74
cemetery 67.3
sky 64.7
old 41.4

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 97.3%
Happy 81.6%
Calm 11.5%
Fear 2.9%
Sad 1.4%
Angry 0.9%
Surprised 0.8%
Disgusted 0.6%
Confused 0.2%

AWS Rekognition

Age 23-33
Gender Female, 97.5%
Surprised 41.7%
Fear 26.4%
Happy 12%
Calm 7.5%
Angry 4.5%
Confused 3.5%
Sad 2.3%
Disgusted 2%

AWS Rekognition

Age 33-41
Gender Male, 98.3%
Calm 54.6%
Happy 18.1%
Angry 15%
Disgusted 7.6%
Sad 2.3%
Surprised 1%
Fear 0.8%
Confused 0.6%

AWS Rekognition

Age 29-39
Gender Male, 90.4%
Calm 98.6%
Sad 0.4%
Fear 0.3%
Angry 0.3%
Happy 0.2%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-30
Gender Male, 93.8%
Sad 50.5%
Happy 30.4%
Fear 4.4%
Disgusted 4.2%
Calm 3.6%
Angry 3.1%
Confused 2.7%
Surprised 1.2%
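
The five face records above follow the response shape of Rekognition's detect_faces call with full attributes. A minimal, self-contained sketch that prints the same fields, under the same assumptions as before (configured AWS credentials, hypothetical filename):

import boto3

rekognition = boto3.client("rekognition")

with open("hoop_rolling_race.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; the records above list them in
    # descending confidence order.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")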

Feature analysis

Amazon

Person
Person 96.8%
Person 95.6%
Person 93.9%
Person 91%
Person 89.8%
Person 89%
Person 88.3%
Person 87.9%
Person 87.3%
Person 77%
Person 68.2%
Person 60.8%
Person 55.6%
Person 53.1%
Person 51.5%

Text analysis

Google

YT33A2- A
YT33A2-
A
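
The detected string YT33A2- is likely an edge or sleeve marking on the film negative rather than content of the scene; OCR services typically return the full detected string first and then its individual tokens, which matches the three lines above. As a sketch of one plausible way to reproduce this output with the google-cloud-vision Python client (an assumption; the record does not say which Google service or client was used):

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes configured credentials

with open("hoop_rolling_race.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    # First entry is the full string ("YT33A2- A"); the rest are tokens.
    print(annotation.description)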