Human Generated Data

Title

Untitled (family posed with raised glasses in dining room with table decorated for Christmas)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9155

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.3
Person 99.1
Person 98
Person 97.9
Person 97.6
Meal 97.6
Food 97.6
Cake 97
Dessert 97
Icing 96.7
Cream 96.7
Creme 96.7
Dish 94.9
Person 94.8
Person 93.6
Clothing 91.5
Apparel 91.5
Tablecloth 91
Person 88.4
Sweets 82.7
Confectionery 82.7
People 81.6
Face 75
Chair 72.3
Furniture 72.3
Dinosaur 70.3
Animal 70.3
Reptile 70.3
Table 70
Home Decor 69.5
Portrait 63.5
Photography 63.5
Photo 63.5
Crowd 62.9
Person 60.7
Gown 57.3
Fashion 57.3
Suit 57.2
Coat 57.2
Overcoat 57.2
Female 56.2
Dress 55.9
Linen 55.1
Person 51.5
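
The Amazon tags above are object and scene labels, each paired with a confidence score on a 0-100 scale. The following is a minimal sketch of how such labels might be requested from AWS Rekognition's DetectLabels API with boto3; it is an illustration, not the museums' actual pipeline, and the file path and MinConfidence threshold are assumptions.

# Minimal sketch: object/scene labels from AWS Rekognition DetectLabels.
# The file path and the MinConfidence cutoff are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # placeholder local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; the list above bottoms out near 51
)

# Each label carries a name and a 0-100 confidence score, matching the
# "Person 99.5", "Tablecloth 91", ... entries above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')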

Clarifai
created on 2023-10-26

people 99.8
group together 98.1
group 96.8
woman 96.2
adult 96
child 95.7
many 95.6
man 95.6
recreation 90.8
wear 87
boy 86.6
audience 83.4
actress 83.4
several 83.3
spectator 83.1
five 79.6
music 79.5
administration 79.3
enjoyment 78.7
adolescent 77

Imagga
created on 2022-01-23

people 22.3
man 21.5
women 15.8
person 15.2
male 14.9
happy 14.4
adult 14.1
sitting 12
dress 11.7
lifestyle 11.6
cheerful 11.4
men 11.2
happiness 11
smiling 10.8
portrait 10.3
love 10.2
child 10.2
smile 10
leisure 10
bride 9.9
kin 9.7
black 9.6
life 9.5
outside 9.4
room 9.3
outdoor 9.2
business 9.1
old 9
outdoors 8.9
balcony 8.9
home 8.8
couple 8.7
down 8.5
vehicle 8.5
relaxation 8.4
world 8.4
bobsled 8.3
vintage 8.3
tourism 8.2
retro 8.2
interior 8
working 7.9
travel 7.7
attractive 7.7
two 7.6
togetherness 7.5
friends 7.5
fun 7.5
groom 7.4
wedding 7.3
back 7.3
lady 7.3
girls 7.3
group 7.2
color 7.2
computer 7.2
cute 7.2
clothing 7.1
face 7.1
indoors 7
vessel 7

Google
created on 2022-01-23

Outerwear 95.2
Photograph 94.2
Table 88
Style 83.8
Black-and-white 83.4
Font 79.4
Adaptation 79.2
Art 77.8
Snapshot 74.3
Event 73.3
Suit 72.6
Monochrome 71.2
Monochrome photography 70
Vintage clothing 69.4
Room 68.7
Rectangle 67.9
T-shirt 67.2
Visual arts 65.9
Tablecloth 65.5
History 62.5

Microsoft
created on 2022-01-23

person 99.4
birthday cake 96.3
clothing 92.3
people 87.1
text 81.3
window 81.1
group 78.5
wedding cake 74.8
woman 62
cake 58.6
posing 40.2
family 18.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Male, 93.7%
Happy 98.6%
Sad 1%
Confused 0.1%
Calm 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 72.9%
Calm 69.6%
Sad 14.7%
Happy 12.7%
Confused 1.6%
Disgusted 0.6%
Surprised 0.4%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 97.5%
Happy 86.2%
Sad 9.7%
Surprised 1.9%
Fear 0.8%
Angry 0.5%
Confused 0.4%
Disgusted 0.3%
Calm 0.3%

AWS Rekognition

Age 37-45
Gender Male, 99%
Calm 96.8%
Surprised 2.7%
Happy 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 45-51
Gender Female, 61.5%
Happy 90.8%
Calm 5.2%
Surprised 2.3%
Sad 0.6%
Confused 0.4%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 99%
Happy 0.4%
Sad 0.2%
Surprised 0.1%
Angry 0.1%
Fear 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 24-34
Gender Female, 98.6%
Calm 73.3%
Surprised 12.3%
Sad 6%
Fear 3.3%
Happy 2.9%
Confused 0.9%
Disgusted 0.8%
Angry 0.5%

AWS Rekognition

Age 28-38
Gender Male, 90.8%
Calm 62.2%
Sad 35.5%
Confused 1.5%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0%
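
Each AWS Rekognition block above reports, for one detected face, an estimated age range, a gender guess with its confidence, and a ranked emotion distribution. A minimal sketch of how such per-face attributes might be obtained from Rekognition's DetectFaces API follows; the image source is a placeholder and this is not a description of the museums' actual pipeline.

# Minimal sketch: per-face age range, gender, and emotion estimates from
# AWS Rekognition DetectFaces. The image source is a placeholder assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 47, "High": 53}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 93.7}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as type/confidence pairs, as tabulated above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')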

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
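
The Google Vision blocks above report per-face likelihood ratings ("Very unlikely" through "Very likely") rather than numeric scores. A minimal sketch using the google-cloud-vision Python client is given below; the file path is a placeholder and the credential setup is assumed.

# Minimal sketch: face likelihood ratings (joy, sorrow, anger, surprise,
# headwear, blurred) from the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # placeholder local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values map onto the wording used above.
likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])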

Feature analysis

Amazon

Person 99.5%
Dinosaur 70.3%

Categories

Text analysis

Amazon

n
13150
M 113
M 113 ACTMA
ACTMA
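
The Amazon entries above are raw OCR detections of text visible in the photograph. A minimal sketch using Rekognition's DetectText API follows; the image source is a placeholder assumption.

# Minimal sketch: OCR text detections from AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections include both whole lines and individual words, which is why
# overlapping fragments such as "M 113" and "M 113 ACTMA" both appear above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])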