Human Generated Data

Title

Untitled (eight family members posed looking at each other in dining room)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9351

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 99
Person 98.9
Person 97.3
Person 97
Person 96.9
Person 96.7
Person 94.3
Priest 87.2
Clothing 84.5
Apparel 84.5
People 81.4
Church 73.2
Building 73.2
Architecture 73.2
Robe 69.5
Fashion 69.5
Face 65.1
Gown 64.2
Female 64
Wedding 63.7
Bishop 59.3
Altar 57.9
Stage 57
Portrait 56.9
Photography 56.9
Photo 56.9
Suit 56.8
Coat 56.8
Overcoat 56.8
Dress 56.7
Bridegroom 55.4

Clarifai
created on 2023-10-27

people 99.9
group 99.7
adult 98.6
woman 98
medical practitioner 97.2
man 96.7
group together 96.7
many 94.4
leader 90.5
several 89.1
child 88.3
uniform 86.3
wear 85.7
four 85.3
three 84.9
administration 83.9
elderly 83.5
education 81.8
healthcare 81
five 81

Imagga
created on 2022-01-23

marimba 72.7
percussion instrument 62.7
musical instrument 50.9
man 25.5
black 21.6
people 20.1
male 19.1
blackboard 16.4
person 15.6
men 15.4
adult 13.8
water 12.7
women 12.6
businessman 12.3
business 12.1
two 11.8
vintage 10.7
group 10.5
couple 10.4
art 10.4
smiling 10.1
lifestyle 10.1
old 9.7
office 9.6
education 9.5
sitting 9.4
light 9.3
room 9.2
outdoors 8.9
cheerful 8.9
happiness 8.6
cadaver 8.5
indoors 7.9
grunge 7.7
relaxation 7.5
happy 7.5
leisure 7.5
dirty 7.2
school 7.2
team 7.2
classroom 7.1
smile 7.1
to 7.1
modern 7

Google
created on 2022-01-23

Outerwear 95.4
Photograph 94.2
Coat 87.3
Suit 76.4
Plant 76.3
Snapshot 74.3
Font 74.2
Monochrome 73.7
Event 71.8
Art 71.6
Vintage clothing 70.5
Monochrome photography 70.5
Room 69.6
History 67.8
Stock photography 65
Visual arts 55.5
Classic 53.1
Holy places 52.5

Microsoft
created on 2022-01-23

person 98.3
text 96.4
clothing 92.8
man 90
standing 79
funeral 69
candle 68.1
black 65.4
white 60.4
posing 58.2
woman 54.2
old 47.6

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Calm 73%
Happy 12.5%
Surprised 10.5%
Confused 1.2%
Sad 0.8%
Disgusted 0.8%
Fear 0.7%
Angry 0.4%

AWS Rekognition

Age 29-39
Gender Male, 98.5%
Calm 92.1%
Angry 3%
Surprised 1.8%
Happy 1.6%
Confused 0.7%
Sad 0.4%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 98.4%
Confused 92.4%
Calm 5%
Happy 0.9%
Sad 0.9%
Disgusted 0.5%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 41-49
Gender Female, 95.5%
Calm 96.1%
Sad 1.7%
Happy 0.7%
Disgusted 0.4%
Angry 0.3%
Confused 0.3%
Surprised 0.3%
Fear 0.1%

AWS Rekognition

Age 54-62
Gender Female, 85.8%
Calm 82.5%
Happy 16%
Sad 0.5%
Angry 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 51-59
Gender Male, 99.7%
Happy 51.8%
Sad 35.9%
Surprised 5.1%
Confused 4.5%
Fear 0.8%
Disgusted 0.7%
Calm 0.7%
Angry 0.6%

AWS Rekognition

Age 35-43
Gender Male, 52.1%
Surprised 74.2%
Happy 18.6%
Calm 5.2%
Sad 0.9%
Confused 0.5%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

L
MJI7
A
1
MJI7 ACSNA
J
ACSNA
U

Google

2 a a MJ17 YT37A2 A
2
a
MJ17
YT37A2
A