Human Generated Data

Title

Untitled (Tynes Street Baptist Church groundbreaking)

Date

1941

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2116

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 96.7
Human 96.7
Person 94.8
Person 93.1
Person 91.5
Nature 91.4
Outdoors 90.9
Person 89.8
Person 89.5
Building 85.6
Person 85.2
Shelter 83.8
Rural 83.8
Countryside 83.8
People 82.8
Housing 73.6
Person 73.3
Person 72
Tree 70.1
Plant 70.1
Person 69.2
Person 65.4
House 63.6
Crowd 61.7
Architecture 61.7
Person 60.4
Photography 60
Photo 60
Winter 59.5
Villa 59.3
Snow 58.4
Church 56.5

Clarifai
created on 2023-10-15

people 99.4
group 98
religion 95.8
building 95.4
architecture 94.7
adult 93.7
man 92.2
temple 90.9
cemetery 87
home 86.3
many 84.5
grave 83.4
monochrome 82.3
woman 80.8
statue 80.5
child 80.3
art 79.1
monument 77.7
house 77.2
group together 75.7

Imagga
created on 2021-12-14

picket fence 100
fence 100
barrier 85.3
obstruction 57.2
architecture 53.2
building 48.5
structure 41.1
cemetery 35
old 32.1
history 31.4
travel 30.3
sky 30
house 26.5
church 25.9
city 25
tourism 23.1
historic 23
tower 22.4
religion 21.5
ancient 20.8
culture 20.5
roof 18.1
brick 17.9
town 16.7
palace 16.7
landscape 16.4
historical 16
famous 15.8
temple 15.8
stone 15.5
monument 15
wall 14.8
window 14.7
tree 14.6
dome 14.5
landmark 14.5
construction 13.7
tourist 13.6
traditional 13.3
exterior 12.9
winter 12.8
snow 12.6
clouds 11.9
cathedral 11.5
wooden 11.4
antique 11.3
religious 11.3
outdoors 10.5
capital 10.5
castle 10.3
day 10.2
trees 9.8
museum 9.7
rural 9.7
worship 9.7
heritage 9.7
urban 9.6
buildings 9.5
hill 9.4
outdoor 9.2
wood 9.2
night 8.9
village 8.7
god 8.6
cross 8.5
residence 8
home 8
fortress 8
country 7.9
orthodox 7.8
scene 7.8
summer 7.7
place 7.5
water 7.4
monastery 7.3
art 7.2
river 7.1
grass 7.1

Microsoft
created on 2021-12-14

building 99.7
outdoor 98.5
text 94
sky 84.9
black 79.6
white 74.7
black and white 70.5
old 66.6
church 58.8

Face analysis

Amazon

AWS Rekognition

Age 52-70
Gender Male, 84.9%
Calm 85%
Sad 9.4%
Angry 2.3%
Confused 1.5%
Disgusted 0.8%
Happy 0.5%
Surprised 0.4%
Fear 0.1%

AWS Rekognition

Age 51-69
Gender Female, 82.9%
Happy 70%
Calm 28.6%
Surprised 0.5%
Sad 0.4%
Confused 0.2%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 11-21
Gender Female, 50.7%
Happy 73.4%
Calm 20.8%
Surprised 1.9%
Angry 1.1%
Sad 0.9%
Disgusted 0.7%
Confused 0.6%
Fear 0.6%

AWS Rekognition

Age 19-31
Gender Male, 85.3%
Happy 66.6%
Calm 20.2%
Disgusted 6.5%
Angry 2.9%
Sad 2%
Confused 1.2%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 32-48
Gender Male, 50%
Happy 83.3%
Angry 5.7%
Sad 2.7%
Calm 2.6%
Disgusted 2.4%
Confused 2%
Surprised 0.7%
Fear 0.6%

AWS Rekognition

Age 49-67
Gender Female, 82.7%
Happy 65.3%
Calm 13.4%
Angry 7.9%
Sad 6.3%
Fear 4.8%
Surprised 1%
Disgusted 0.7%
Confused 0.6%

AWS Rekognition

Age 54-72
Gender Male, 87%
Happy 60.4%
Calm 28.8%
Surprised 3.1%
Angry 3%
Sad 2.2%
Confused 1.6%
Fear 0.5%
Disgusted 0.5%

AWS Rekognition

Age 42-60
Gender Female, 94.9%
Happy 46.9%
Surprised 19.4%
Calm 12%
Fear 7.9%
Sad 4.3%
Disgusted 3.9%
Angry 2.8%
Confused 2.8%

Feature analysis

Amazon

Person 96.7%

Text analysis

Amazon

АЗДА
EIT
M