Human Generated Data

Title

Untitled (parents helping son to cut cake while others watch at Bar Mitzvah table)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9539

Machine Generated Data

Tags (confidence %)

Amazon
created on 2022-01-28

Clothing 99.9
Apparel 99.9
Person 99.6
Human 99.6
Person 99.2
Person 99.1
Person 98.9
Person 98.3
Person 97.6
Person 96.9
Robe 94.5
Fashion 94.5
People 93.5
Dress 93.3
Gown 93.2
Wedding 92.5
Person 91.6
Suit 90.6
Overcoat 90.6
Coat 90.6
Bridegroom 88.8
Face 88.4
Bride 85.2
Wedding Gown 85.2
Person 81.6
Cake 81.3
Dessert 81.3
Food 81.3
Female 77.2
Meal 75.3
Tablecloth 71.7
Flower 71.7
Blossom 71.7
Plant 71.7
Person 71.2
Portrait 68
Photography 68
Photo 68
Woman 67.1
Icing 63.9
Cream 63.9
Creme 63.9
Table 63
Furniture 63
Helmet 62.7
Man 61.7
Flower Arrangement 57.5
Crowd 56.1
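The Amazon labels above follow the shape of Amazon Rekognition's DetectLabels response. A minimal sketch of how such a list could be regenerated with boto3; the image filename and the minimum-confidence cutoff are assumptions, not part of the museum record.

```python
import boto3

# Hypothetical local copy of the photograph; not part of the museum record.
IMAGE_PATH = "schweig_bar_mitzvah_1955.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the lowest score listed above is about 56
    )

# Print each label with its confidence, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```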

Clarifai
created on 2023-10-27

people 99.9
group 99
group together 98.8
man 97.8
woman 96.3
adult 95.9
many 95.4
family 89.6
administration 88.5
leader 88.2
several 86.3
five 83.7
elderly 80.6
furniture 79.3
room 78.8
wear 77.6
four 77.5
child 76.5
recreation 69.1
music 66.2
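The Clarifai concepts could, in principle, be reproduced with Clarifai's v2 predict endpoint. The API key, model ID, and request shape below are assumptions based on Clarifai's public REST API, not details recorded here.

```python
import base64
import requests

# Placeholder credentials and model; Clarifai issues its own keys and model IDs.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)

# Concepts carry a 0-1 value; scale to match the percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```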

Imagga
created on 2022-01-28

marimba 100
percussion instrument 100
musical instrument 100
man 34.9
people 30.6
male 29.7
smiling 23.1
women 22.9
men 22.3
happy 21.9
cheerful 20.3
sitting 19.7
couple 19.1
senior 18.7
group 17.7
person 17.1
business 17
happiness 16.4
adult 16.2
businessman 14.1
room 13.7
home 12.7
love 12.6
portrait 12.3
together 12.2
meeting 12.2
lifestyle 11.6
30s 11.5
indoors 11.4
friends 11.3
table 11.2
office 11.2
team 10.7
family 10.7
fun 10.5
friendship 10.3
vibraphone 10.2
mature 10.2
two 10.2
standing 9.5
holding 9.1
black 9
colleagues 8.7
four 8.6
drinking 8.6
smile 8.5
enjoyment 8.4
mother 8.4
joy 8.3
wine 8.3
outdoors 8.2
suit 8.1
celebration 8
holiday 7.9
work 7.8
day 7.8
wall 7.7
talking 7.6
enjoying 7.6
desk 7.5
relationship 7.5
leisure 7.5
training 7.4
vacation 7.4
water 7.3
girls 7.3
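The Imagga tags resemble output from Imagga's v2 tagging endpoint. A sketch under that assumption; the credentials are placeholders and the response shape follows Imagga's public REST documentation.

```python
import requests

# Placeholder credentials; Imagga uses HTTP Basic auth with an API key/secret pair.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Imagga reports confidences on a 0-100 scale, as in the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```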

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 99.2
old 96.3
wall 95.7
clothing 95.1
standing 83.4
posing 82.7
text 81.3
black 80.8
group 74.7
white 62.4
woman 60.5
man 59.7
black and white 52.8
vintage 34
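The Microsoft tags look like output from Azure's Computer Vision image-tagging endpoint. A sketch under that assumption; the resource URL and key are placeholders, and the service's 0-1 confidences are scaled to percentages to match the listing.

```python
import requests

# Placeholder endpoint and key; an Azure Computer Vision resource provides both.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_VISION_KEY"

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Scale the 0-1 confidences to percentages.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```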

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Female, 93.1%
Sad 53.8%
Calm 13%
Happy 12.3%
Confused 6.6%
Angry 4.4%
Disgusted 3.8%
Surprised 3.5%
Fear 2.7%

AWS Rekognition

Age 41-49
Gender Male, 96.9%
Calm 41.2%
Sad 40.9%
Happy 5.9%
Angry 3.6%
Fear 3%
Surprised 2.3%
Confused 2.3%
Disgusted 0.9%

AWS Rekognition

Age 41-49
Gender Male, 99.6%
Calm 98.8%
Disgusted 0.3%
Happy 0.3%
Angry 0.2%
Confused 0.2%
Surprised 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Happy 44.3%
Sad 22.2%
Confused 12.4%
Disgusted 11.5%
Surprised 4%
Calm 2.5%
Angry 2%
Fear 1%

AWS Rekognition

Age 38-46
Gender Female, 94.5%
Calm 99.8%
Sad 0.1%
Happy 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 95.9%
Calm 68.2%
Surprised 20.6%
Happy 4.1%
Sad 3.3%
Fear 2.1%
Confused 0.8%
Angry 0.5%
Disgusted 0.4%

AWS Rekognition

Age 42-50
Gender Male, 99.5%
Calm 98.6%
Sad 0.4%
Angry 0.3%
Happy 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 99%
Calm 99.3%
Sad 0.6%
Angry 0.1%
Confused 0%
Happy 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 90.8%
Calm 50.3%
Surprised 30.1%
Happy 11.2%
Confused 3.7%
Sad 2%
Fear 1%
Disgusted 0.9%
Angry 0.8%

AWS Rekognition

Age 30-40
Gender Female, 85.3%
Happy 66.5%
Sad 23.3%
Calm 6.6%
Confused 1.1%
Disgusted 0.8%
Surprised 0.7%
Angry 0.6%
Fear 0.5%

AWS Rekognition

Age 43-51
Gender Male, 59.2%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%
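The per-face age ranges, gender estimates, and emotion scores above follow the shape of Amazon Rekognition's DetectFaces response. A minimal sketch, again assuming a local copy of the image.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotion scores
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()
```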

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
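The likelihood ratings above match the enum values returned by Google Cloud Vision's face detection. A sketch using the google-cloud-vision client (v2 or later is assumed); the filename is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
likelihood = vision.Likelihood
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
    print()
```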

Feature analysis

Amazon

Person
Helmet
Person 99.6%
Person 99.2%
Person 99.1%
Person 98.9%
Person 98.3%
Person 97.6%
Person 96.9%
Person 91.6%
Person 81.6%
Person 71.2%
Helmet 62.7%
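The repeated Person entries are per-instance detections. With Amazon Rekognition these come back as Instances attached to a label, each with its own confidence and bounding box; a sketch under that assumption.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Labels such as Person and Helmet can carry multiple instances, which is
# why Person repeats above with different confidences.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(w={box['Width']:.2f}, h={box['Height']:.2f})")
```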

Text analysis

Amazon

OT
KODAKA-EITW

Google

MJI7--YT37A°2 - - XAGO
MJI7--YT37A°2
-
XAGO
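Fragments like these typically come from OCR on the print and its border (the garbled "KODAK" string suggests film or paper edge markings). A sketch of the Amazon side using Rekognition's DetectText, assuming the same hypothetical image file; the Google results would come from the Vision API's text detection instead.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("schweig_bar_mitzvah_1955.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries correspond to short fragments like those listed above;
# WORD entries break the lines down further.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")
```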