Human Generated Data

Title

Untitled (five clergymen posed at front of church)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9358

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 98.8
Apparel 98.8
Person 98.3
Human 98.3
Person 97.3
Person 96.6
Person 95.1
Person 89.1
Priest 87.3
Coat 72.4
People 71.7
Overcoat 71.7
Bishop 65.7
Dress 62.2
Face 61.1
Hat 57.7
Flooring 55.2
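
The label/confidence pairs above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch with boto3 follows; the image path is a hypothetical local copy of the photograph, and the label cap and confidence floor are chosen only to roughly match the list above.

```python
# Minimal sketch: produce label/confidence pairs like the Amazon list above
# using AWS Rekognition DetectLabels via boto3. "image.jpg" is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,       # roughly the number of labels shown above
    MinConfidence=55,   # the lowest confidence shown above is ~55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```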

Clarifai
created on 2023-10-26

people 99.8
group 98
wear 97.9
adult 97.5
man 95
woman 94.4
coat 93.5
gown (clothing) 93.3
administration 93.2
outfit 92.8
leader 92.1
street 90.1
outerwear 90.1
group together 89.3
child 88.7
three 86.1
veil 85.5
several 85.5
ceremony 83.8
priest 82.4
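
Concept scores like the Clarifai list above can be obtained from Clarifai's gRPC API. The sketch below is heavily hedged: the personal access token and image URL are placeholders, and the model ID, app scope, and response fields should be verified against Clarifai's current documentation.

```python
# Heavily hedged sketch: query Clarifai's general image-recognition model
# over gRPC. The PAT, model ID, and image URL are placeholders/assumptions.
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key CLARIFAI_PAT"),)  # placeholder personal access token

request = service_pb2.PostModelOutputsRequest(
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",  # assumed public model ID
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.org/photo.jpg")  # placeholder
            )
        )
    ],
)

response = stub.PostModelOutputs(request, metadata=metadata)
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")
```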

Imagga
created on 2022-01-23

metropolitan 100
building 23.9
architecture 23.4
old 20.9
clothing 20.1
vestment 20.1
people 19.5
city 18.3
gown 17.3
tourism 16.5
history 16.1
street 15.6
man 14.8
religion 14.3
urban 14
stone 13.5
travel 13.4
church 12.9
business 12.7
outerwear 12.4
adult 12.4
monument 12.1
tourist 11.8
arch 10.7
statue 10.5
clothes 10.3
historic 10.1
jacket 9.6
sculpture 9.5
window 9.5
wall 9.4
historical 9.4
male 9.2
black 9
scene 8.7
ancient 8.6
cathedral 8.6
men 8.6
door 8.6
walking 8.5
famous 8.4
garment 8.2
landmark 8.1
group 8.1
coat 7.8
culture 7.7
blurred 7.7
human 7.5
place 7.4
tradition 7.4
indoors 7
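
Tags like the Imagga list above are typically returned by Imagga's /v2/tags REST endpoint. A hedged sketch follows; the API key, secret, and image URL are placeholders, and the response shape should be checked against Imagga's current documentation.

```python
# Hedged sketch: fetch tag/confidence pairs from Imagga's /v2/tags endpoint.
# Credentials and the image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),           # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```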

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 89.7
clothing 86.9
snow 84.5
window 83.3
person 76.8
clothes 19
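
The Microsoft tags above resemble the output of the Azure Computer Vision "analyze" REST endpoint. A hedged sketch follows; the resource endpoint, subscription key, and image URL are placeholders, and the API version should be confirmed against the Azure documentation. Azure reports confidences on a 0-1 scale, so they are scaled to percentages here.

```python
# Hedged sketch: request image tags from the Azure Computer Vision analyze
# endpoint (v3.2). Endpoint, key, and image URL are placeholders.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "AZURE_CV_KEY"},  # placeholder key
    json={"url": "https://example.org/photo.jpg"},          # placeholder URL
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```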

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 100%
Confused 27.2%
Happy 27.2%
Calm 21.3%
Sad 12.4%
Angry 6.2%
Disgusted 3%
Surprised 2.3%
Fear 0.5%

AWS Rekognition

Age 25-35
Gender Male, 79.1%
Calm 98.4%
Surprised 1%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 42-50
Gender Male, 100%
Sad 68.9%
Surprised 12.1%
Calm 6%
Confused 5.1%
Fear 2.9%
Happy 2.2%
Disgusted 1.4%
Angry 1.3%

AWS Rekognition

Age 28-38
Gender Male, 89.9%
Calm 99.3%
Sad 0.4%
Happy 0.2%
Surprised 0%
Angry 0%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 49-57
Gender Male, 100%
Calm 99.1%
Surprised 0.5%
Sad 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Angry 0%
Fear 0%
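
Per-face age ranges, gender, and emotion scores like the AWS Rekognition blocks above come from the DetectFaces operation with all attributes requested. A minimal sketch with boto3 follows; the image path is hypothetical.

```python
# Minimal sketch: per-face age range, gender, and emotion scores like the
# AWS Rekognition blocks above, via DetectFaces with Attributes=["ALL"].
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are listed highest-confidence first, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```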

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
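
The Google Vision blocks above report per-face likelihood ratings rather than numeric scores. A hedged sketch of the Cloud Vision face_detection call follows; the image path is hypothetical, and the helper that rewrites enum names into the page's wording is an assumption about presentation only.

```python
# Hedged sketch: per-face likelihood ratings (surprise, anger, sorrow, joy,
# headwear, blurred) via Google Cloud Vision face_detection.
from google.cloud import vision


def rating(value) -> str:
    # e.g. Likelihood.VERY_UNLIKELY -> "Very unlikely", matching the wording above.
    return vision.Likelihood(value).name.replace("_", " ").capitalize()


client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    print("Surprise", rating(face.surprise_likelihood))
    print("Anger", rating(face.anger_likelihood))
    print("Sorrow", rating(face.sorrow_likelihood))
    print("Joy", rating(face.joy_likelihood))
    print("Headwear", rating(face.headwear_likelihood))
    print("Blurred", rating(face.blurred_likelihood))
```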

Feature analysis

Amazon

Person 98.3%

Categories

Text analysis

Amazon

e
#03 e S
S
#03
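
OCR fragments like the Amazon text-analysis lines above (here apparently edge markings on the negative) are the kind of output returned by AWS Rekognition's DetectText operation. A minimal sketch with boto3 follows; the image path is hypothetical.

```python
# Minimal sketch: OCR fragments like the Amazon text-analysis lines above,
# via AWS Rekognition DetectText. "image.jpg" is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries are whole lines; WORD entries are the individual fragments.
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```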

Google

te 3 ea YT3RA2券ACO>
te
3
ea
YT3RA2
ACO
>
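
The Google lines above follow the structure of Cloud Vision text_detection output, where the first annotation is the full detected string and the remaining annotations are the individual tokens. A hedged sketch follows; the image path is hypothetical.

```python
# Hedged sketch: text fragments like the Google lines above, via Google
# Cloud Vision text_detection. "image.jpg" is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:  # hypothetical local copy of the photograph
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

# First annotation: full string; subsequent annotations: individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```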