Human Generated Data

Title

Untitled (crowd seated outside for church ceremony)

Date

1949

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6249

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-22

Nature 96.1
Person 95.8
Human 95.8
Outdoors 95.1
Furniture 89.3
Face 79.6
Person 77.9
Person 76.8
Tree 76
Plant 76
People 75.7
Person 74.9
Shelter 74.5
Countryside 74.5
Building 74.5
Rural 74.5
Person 72.9
Yard 71.1
Person 71
Person 70.9
Person 69.8
Person 65.8
Bench 65.6
Land 64
Boat 62.1
Transportation 62.1
Vehicle 62.1
Photography 60
Photo 60
Crowd 57.5
Person 50
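
The label/confidence pairs above match the shape of Amazon Rekognition's DetectLabels response; the repeated Person entries are consistent with Rekognition reporting several person instances under one label. A minimal sketch of how comparable tags could be generated with boto3, assuming a local image file and placeholder region (the museum's actual pipeline is not documented here):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,  # drop weak labels; the list above bottoms out near 50
    )

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```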

Clarifai
created on 2023-10-26

people 99.9
many 99.3
adult 99
group 98.6
group together 96.7
man 93.5
chair 92.6
woman 92
furniture 91.8
home 91.8
administration 91.4
war 87
leader 86.9
child 85.7
crowd 84
seat 83.7
military 83.5
campsite 83.3
wear 82
cemetery 81.4
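
Clarifai's general model scores concepts from 0 to 1, so the values above look like those scores scaled to percentages. A hedged sketch against Clarifai's v2 REST API; the personal access token, public model ID, and image URL are placeholder assumptions (check the current Clarifai docs for the exact route):

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"                    # placeholder credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concepts arrive with 0-1 values; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```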

Imagga
created on 2022-01-22

barrow 41.1
handcart 36.4
wheeled vehicle 32.7
vehicle 21.1
old 20.9
cemetery 18.6
building 18.2
landscape 17.9
architecture 15.6
travel 15.5
house 15.1
water 14.7
sky 14.7
rural 14.1
structure 14.1
patio 13.4
area 13.4
trees 13.3
tourism 13.2
city 12.5
grunge 11.9
conveyance 11.6
tree 11.6
vintage 11.6
chair 11.2
stone 10.7
light 10.7
town 10.2
snow 9.9
seat 9.9
bench 9.8
scene 9.5
wall 9.5
sea 9.4
street 9.2
construction 8.6
outdoor 8.4
boat 8.4
dark 8.4
wood 8.3
outdoors 8.3
lake 8.2
sunset 8.1
mountain 8
scenic 7.9
brick 7.9
gravestone 7.8
antique 7.8
roof 7.8
winter 7.7
furniture 7.6
sun 7.2
road 7.2
scenery 7.2
black 7.2
transportation 7.2
art 7.2
history 7.2
paper 7.1
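
Imagga's v2 tagging endpoint returns English tags with 0-100 confidences, matching the list above. A minimal sketch; the API key, secret, and image URL are placeholders:

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder image
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),               # placeholder credentials
)

# Tags arrive already scored 0-100, as in the list above.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```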

Microsoft
created on 2022-01-22

tree 98
grave 97
cemetery 95.8
black and white 92.4
text 88.4
outdoor 88.3
old 79.4
house 70.2
white 62.6
snow 51.7
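
The Microsoft tags are consistent with Azure Computer Vision's Analyze Image operation, which scores tags from 0 to 1. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholder assumptions:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image
)

# Confidences are 0-1; scale to match the percentages above.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```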

Face analysis

Amazon

AWS Rekognition (face 1)

Age 29-39
Gender Male, 93.7%
Calm 88.4%
Happy 6.1%
Sad 2.7%
Disgusted 0.9%
Fear 0.8%
Confused 0.5%
Angry 0.5%
Surprised 0.2%

AWS Rekognition (face 2)

Age 21-29
Gender Male, 94.2%
Calm 74.1%
Happy 21.4%
Sad 1.7%
Disgusted 0.9%
Confused 0.6%
Surprised 0.5%
Angry 0.5%
Fear 0.4%

AWS Rekognition (face 3)

Age 26-36
Gender Male, 97.3%
Happy 58.2%
Sad 12%
Calm 9.5%
Disgusted 8.6%
Fear 5.2%
Surprised 2.5%
Angry 2.3%
Confused 1.6%

AWS Rekognition (face 4)

Age 22-30
Gender Female, 59.7%
Fear 70.5%
Sad 10.9%
Calm 8.3%
Confused 4.8%
Disgusted 1.5%
Happy 1.4%
Angry 1.4%
Surprised 1.3%

AWS Rekognition (face 5)

Age 22-30
Gender Male, 60.9%
Calm 62%
Sad 21.5%
Happy 10.9%
Angry 2.2%
Disgusted 1.8%
Confused 0.7%
Surprised 0.5%
Fear 0.3%

AWS Rekognition (face 6)

Age 20-28
Gender Female, 75%
Calm 77.9%
Confused 8.3%
Surprised 4.8%
Sad 4.2%
Angry 1.9%
Disgusted 1.3%
Fear 1.1%
Happy 0.5%
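
Each block above corresponds to one entry in Rekognition's DetectFaces response, which estimates an age range, a gender with its own confidence, and a score for each of eight emotions. A sketch of how those fields are read, assuming the same placeholder image and region as above:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for i, face in enumerate(response["FaceDetails"], start=1):
    age = face["AgeRange"]    # e.g. {"Low": 29, "High": 39}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 93.7}
    # Sort emotions by confidence, highest first, as in the blocks above.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Face {i}: age {age["Low"]}-{age["High"]}, '
          f'{gender["Value"]} {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```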

Feature analysis

Amazon

Person 95.8%
Boat 62.1%

Text analysis

Amazon

YT37A8-XAOOX
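
The string above is raw OCR output of the kind returned by Rekognition's DetectText operation, which reports LINE and WORD detections with confidences; the record keeps the detected line verbatim, garbled or not. A minimal sketch, again with placeholder image and region:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries aggregate WORD entries; the record above keeps only the line text.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```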