Human Generated Data

Title

Untitled (very large group posing outside Church of the Nazarene)

Date

c. 1950

People

Artist: Jack Rodden Studio (American, 1914–2016)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13550

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Nature 99.7
Outdoors 99.3
Building 97.4
Countryside 95.7
Rural 92.7
Hut 89.3
Housing 80.7
Shack 79.8
Person 75.3
Human 75.3
House 64.6
Person 62.4
Shelter 62.1
Wheel 58.3
Machine 58.3
Person 57.3
Farm 56.4
Person 50.8
Person 45.3
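
These label/confidence pairs have the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how tags like these could be reproduced with boto3, assuming configured AWS credentials and a local scan of the photograph (the filename is hypothetical, not part of this record):

import boto3

# Hypothetical local scan of the photograph; not a path from this record.
with open("rodden_13550.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns labels with confidence scores on a 0-100 scale,
# matching the "Nature 99.7 ... Person 45.3" pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # the lowest score shown above is 45.3
)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))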

Clarifai
created on 2023-10-29

house 99.6
landscape 99.3
winter 99.2
monochrome 98.8
snow 98.8
infrared 98.6
tree 97.8
light 97.5
storm 97
beach 96.5
building 96.3
roof 95.8
barn 95.5
lake 95.4
home 95.3
architecture 95.1
street 95
fog 94.6
lighthouse 93.8
sky 93.6
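
The Clarifai tags look like concept predictions from its public general image-recognition model. A sketch with the clarifai-grpc client, assuming a personal access token; the user/app/model identifiers below follow Clarifai's published "general" model and are assumptions, not values recorded here:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc
from clarifai_grpc.grpc.api.status import status_code_pb2

channel = ClarifaiChannel.get_grpc_channel()
stub = service_pb2_grpc.V2Stub(channel)
metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder token

with open("rodden_13550.jpg", "rb") as f:  # hypothetical filename
    file_bytes = f.read()

request = service_pb2.PostModelOutputsRequest(
    # Clarifai's public "general" model; assumed, not recorded here.
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[resources_pb2.Input(
        data=resources_pb2.Data(image=resources_pb2.Image(base64=file_bytes)))],
)
response = stub.PostModelOutputs(request, metadata=metadata)
if response.status.code != status_code_pb2.SUCCESS:
    raise RuntimeError(response.status.description)

# Concept values come back in 0-1; the list above shows them as 0-100.
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))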

Imagga
created on 2022-02-04

barn 100
farm building 100
building 100
structure 72.4
landscape 35.7
sky 32.4
tree 29.3
clouds 27.9
trees 27.6
rural 22
field 20.9
old 20.2
countryside 19.2
country 18.5
scene 18.2
scenic 17.6
grass 15.8
summer 15.4
dark 15
wood 15
season 14.8
fog 14.5
sunset 14.4
sun 14.2
cloud 13.8
weather 13.8
horizon 13.5
morning 12.7
farm 12.5
light 12
land 12
scenery 11.7
architecture 11.7
outdoor 11.5
sunrise 11.3
cold 11.2
winter 11.1
day 11
environment 10.7
night 10.7
travel 10.6
agriculture 10.5
house 10
tourism 9.9
forest 9.6
natural 9.4
water 9.4
space 9.3
snow 9.2
peaceful 9.2
black 9
wooden 8.8
dawn 8.7
spring 8.6
bright 8.6
park 8.2
outdoors 8.2
meadow 8.1
river 8
foggy 7.9
mist 7.7
dusk 7.6
skyline 7.6
cloudy 7.5
evening 7.5
silhouette 7.5
vintage 7.5
lake 7.3
autumn 7
seasonal 7
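
Imagga serves tagging over REST. A sketch with the requests library, assuming an API key/secret pair and a hosted copy of the image (the URL is a placeholder):

import requests

# Placeholder URL; the record does not include a public image URL.
image_url = "https://example.org/rodden_13550.jpg"

# Imagga's v2 tagging endpoint uses HTTP basic auth (key, secret) and
# returns confidences on a 0-100 scale, as in the list above.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))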

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 98.7
black and white 92.4
tree 90.8
house 88.9
white 88.3
sky 85.8
black 79
night 69.6
monochrome 65
window 48.5
old 45.4
picture frame 34.6
display 27.4
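
The Microsoft tags are consistent with Azure Computer Vision image tagging. A sketch against the v3.2 analyze REST endpoint, assuming a subscription key and resource endpoint (both placeholders):

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_SUBSCRIPTION_KEY"  # placeholder

with open("rodden_13550.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Request only the Tags feature; scores come back in 0-1 and are
# shown above as percentages (e.g. "text 98.7").
response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))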

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Female, 80.8%
Angry 62.2%
Sad 9.2%
Calm 8.3%
Surprised 7.4%
Disgusted 5%
Confused 3.3%
Fear 2.8%
Happy 1.8%

AWS Rekognition

Age 21-29
Gender Male, 98.8%
Sad 96.4%
Calm 2.2%
Angry 0.7%
Confused 0.3%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 28-38
Gender Female, 64.3%
Calm 45.5%
Angry 19.7%
Sad 10.2%
Fear 9.8%
Happy 6.3%
Disgusted 4.1%
Surprised 2.3%
Confused 2%

AWS Rekognition

Age 16-22
Gender Male, 61%
Calm 34.1%
Fear 30%
Disgusted 12.1%
Angry 7.5%
Happy 6%
Sad 5.4%
Confused 2.5%
Surprised 2.2%

AWS Rekognition

Age 16-22
Gender Female, 72.8%
Calm 67.2%
Sad 20.7%
Angry 3.7%
Confused 2.7%
Disgusted 1.9%
Surprised 1.6%
Happy 1.1%
Fear 1%

AWS Rekognition

Age 22-30
Gender Male, 97.7%
Calm 98.3%
Sad 0.9%
Confused 0.2%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 99.3%
Sad 28.6%
Calm 26.3%
Happy 23.8%
Angry 11.7%
Confused 3.8%
Disgusted 2.3%
Surprised 2%
Fear 1.5%

AWS Rekognition

Age 21-29
Gender Female, 90.8%
Calm 79%
Sad 4.7%
Happy 4.6%
Disgusted 3.4%
Confused 3.2%
Angry 2.9%
Fear 1.4%
Surprised 0.8%
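
Each block above (an age range, a gender estimate, and ranked emotions) matches one FaceDetail from Amazon Rekognition's DetectFaces call with full attributes. A minimal sketch under the same assumptions as the label-detection example:

import boto3

rekognition = boto3.client("rekognition")

with open("rodden_13550.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates
# to each detected face, as in the blocks above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")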

Feature analysis

Amazon

Person 75.3%
Person 62.4%
Person 57.3%
Person 50.8%
Person 45.3%
Wheel 58.3%
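
The repeated Person entries with distinct confidences correspond to individual instances within a single DetectLabels label, each carrying its own bounding box and score. A sketch of how they could be read out, under the same assumptions as above:

import boto3

rekognition = boto3.client("rekognition")

with open("rodden_13550.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Labels such as Person and Wheel include one Instances entry per
# detected occurrence, which is why "Person" repeats above.
response = rekognition.detect_labels(Image={"Bytes": image_bytes})
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # normalized left/top/width/height
        print(label["Name"], round(instance["Confidence"], 1),
              {k: round(v, 3) for k, v in box.items()})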

Text analysis

Amazon

9
NAZARINE
CHE
CHE RCH
RCH
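
The fragments above ("NAZARINE", "CHE RCH") are raw OCR detections of the sign reading "Church of the Nazarene"; the garbling is the machine output itself, not a transcription error. Amazon's side matches Rekognition's DetectText; a minimal sketch under the same assumptions:

import boto3

rekognition = boto3.client("rekognition")

with open("rodden_13550.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# DetectText returns LINE and WORD detections; partial or misread
# strings such as "CHE RCH" are reported as-is with confidences.
response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))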

Google

CHURCHNATA
CHURCHNATA
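
The Google entries are consistent with Cloud Vision text detection, which reports the full detected block first and its fragments after it, so the same string can appear more than once. A sketch with the google-cloud-vision client, assuming application default credentials are configured (the filename is hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("rodden_13550.jpg", "rb") as f:  # hypothetical filename
    content = f.read()

# text_detection returns one annotation for the full detected text,
# followed by one per fragment, which can repeat the same string.
response = client.text_detection(image=vision.Image(content=content))
for annotation in response.text_annotations:
    print(annotation.description)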