Human Generated Data

Title

Untitled (young boy and baby girl playing with 7-Up bottles in living room)

Date

1947

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10013

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.6
Human 99.6
Clothing 99.2
Apparel 99.2
Person 97.7
Furniture 96.5
Dress 94.8
Suit 94.3
Overcoat 94.3
Coat 94.3
Couch 94.2
Shoe 90.9
Footwear 90.9
Chair 90.4
Female 85.7
Woman 72.4
Building 70.2
Architecture 68.4
Face 65.1
Sitting 63.9
Plant 63.5
Photography 60.9
Photo 60.9
Indoors 60.2
Girl 58.4
Tuxedo 57.7
Tree 56.8
Shorts 56.3
Glasses 55.9
Accessories 55.9
Accessory 55.9
Person 43.8
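
The Amazon tag list above pairs each label with a confidence score, which is the shape of output returned by AWS Rekognition's label-detection call. Below is a minimal sketch of how such tags could be retrieved with boto3; the bucket name, object key, and confidence threshold are hypothetical placeholders, not values from this record.

```python
# Minimal sketch: label/confidence pairs from AWS Rekognition via boto3.
# The bucket and object key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=40,  # assumed threshold; the list above goes down to ~43.8
)

# Each entry carries a label name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```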

Clarifai
created on 2023-10-27

people 99.9
child 97.6
adult 97.2
two 97.1
woman 96.3
monochrome 95.4
man 94.6
elderly 94.5
group 94.2
sit 91.2
furniture 90.3
three 90
family 90
administration 89.3
bench 89
recreation 88.8
offspring 88.6
leader 86.5
street 86.5
group together 86.1

Imagga
created on 2022-01-28

man 32.2
people 29.6
person 26
male 23.5
newspaper 19.5
business 18.2
businessman 17.7
adult 16.6
grandma 16.4
cleaner 15.4
product 14.9
urban 14.9
sitting 14.6
black 14.4
room 14.3
office 13.8
men 13.7
city 13.3
couple 13.1
alone 12.8
two 11.9
love 11.8
creation 11.7
human 11.2
looking 11.2
world 10.9
lifestyle 10.8
portrait 10.3
teacher 10.3
chair 10.2
back 10.1
silhouette 9.9
lady 9.7
one 9.7
indoors 9.7
home 9.6
building 9.5
casual 9.3
holding 9.1
old 9.1
suit 9
life 8.9
hairdresser 8.9
working 8.8
computer 8.8
happy 8.8
hair 8.7
women 8.7
day 8.6
wall 8.5
youth 8.5
senior 8.4
street 8.3
indoor 8.2
dress 8.1
family 8
child 7.9
together 7.9
happiness 7.8
education 7.8
modern 7.7
professional 7.6
walk 7.6
finance 7.6
walking 7.6
outdoors 7.6
meeting 7.5
window 7.5
school 7.2
interior 7.1

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 97.1
clothing 93.7
person 92.2
black and white 92
statue 82.7
monochrome 53.6

Color Analysis

Face analysis

AWS Rekognition

Age 14-22
Gender Female, 59.6%
Happy 70.9%
Calm 13.5%
Surprised 6.8%
Sad 3.8%
Angry 2.5%
Disgusted 1.3%
Fear 0.9%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Female, 54.9%
Calm 97.6%
Sad 1.7%
Happy 0.3%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

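The two AWS Rekognition face blocks above (age range, gender, and per-emotion confidences) follow the structure of Rekognition's face-detection response. The sketch below shows one way such values could be produced; the local file path is a hypothetical placeholder.

```python
# Minimal sketch: per-face age range, gender, and emotion scores from AWS Rekognition.
# The local file path is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```
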
Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
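
The Google Vision blocks above report likelihoods ("Very unlikely", and so on) rather than numeric scores, which is how the Cloud Vision face-detection API expresses joy, sorrow, anger, surprise, headwear, and blur. A minimal sketch using the google-cloud-vision client follows; the file path is a placeholder.

```python
# Minimal sketch: likelihood-based face attributes from Google Cloud Vision.
# The local file path is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY, as reported above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```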

Feature analysis

Amazon

Person
Couch
Shoe
Person 99.6%
Person 97.7%
Person 43.8%
Couch 94.2%
Shoe 90.9%

Text analysis

Amazon

7up
MJIR
115021
MJIR YT3RAS ООГИА
YT3RAS
ООГИА
et

Google

MJIR YT
MJIR
YT
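
The text-analysis results above are raw OCR detections, and mis-reads of stylized or mirrored lettering (such as "MJIR" and "ООГИА") are reported as-is. The sketch below shows the kind of call that produces such output with AWS Rekognition's text detection; the local file path is a hypothetical placeholder.

```python
# Minimal sketch: raw OCR detections from AWS Rekognition text detection.
# The local file path is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Both full lines and individual words come back; mis-reads are kept verbatim.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))
```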