Human Generated Data

Title

Untitled (couple examining gifts as girl sits on rocking horse and others sit on couches)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9136

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Clothing 97.6
Apparel 97.6
Person 96.1
Person 95.1
Chair 91.8
Furniture 91.8
Person 90.5
Person 86.7
Person 76.8
Person 73.4
Face 68.5
Shorts 68.2
People 68
Meal 65.8
Food 65.8
Portrait 64
Photography 64
Photo 64
Kid 60.8
Child 60.8
Shoe 60.1
Footwear 60.1
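
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API. Below is a minimal sketch of how such tags could be generated with boto3; the file name, region, and thresholds are illustrative assumptions, not part of this record.

    import boto3

    # Placeholder region and file name.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # matching the "Person 99.2", "Clothing 97.6", ... pairs listed above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=60,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")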

Clarifai
created on 2023-10-26

people 99.9
group 99.1
adult 98.7
woman 96.7
wear 96.6
group together 96.4
child 96.3
two 96
music 95.1
actress 93.3
recreation 93
outfit 92.9
three 92.7
man 92.4
musician 92.4
facial expression 92.2
singer 91.8
several 90.8
one 88.1
movie 87.4
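
Clarifai concept tags like these can be requested from its v2 predict endpoint. A hedged sketch using the REST API follows; the model ID, API key, and image URL are placeholders/assumptions.

    import requests

    # Assumed model ID for Clarifai's general recognition model.
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    headers = {"Authorization": "Key YOUR_API_KEY"}
    payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

    response = requests.post(url, headers=headers, json=payload).json()
    # Each concept carries a name and a 0-1 confidence value, which the
    # record above appears to show scaled to percentages.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")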

Imagga
created on 2022-01-23

person 32.5
man 31.6
adult 27.5
people 26.2
male 23.4
men 18.9
black 18
lifestyle 16.6
fashion 15.8
portrait 15.5
office 15.5
sitting 14.6
business 14.6
musical instrument 14.1
professional 13.7
salon 13.3
indoors 13.2
computer 12.9
sexy 12.8
casual 12.7
one 12.7
style 12.6
room 12.6
chair 12.3
equipment 12.2
looking 12
women 11.9
model 11.7
job 11.5
businessman 11.5
modern 11.2
mature 11.2
two 11
face 10.6
studio 10.6
human 10.5
youth 10.2
music 10
indoor 10
wind instrument 9.9
bass 9.6
shop 9.6
barbershop 9.5
back 9.2
occupation 9.2
alone 9.1
attractive 9.1
interior 8.8
working 8.8
body 8.8
musician 8.7
couple 8.7
active 8.6
businesspeople 8.5
senior 8.4
teacher 8.4
color 8.3
handsome 8
work 7.8
pretty 7.7
old 7.7
career 7.6
communication 7.6
device 7.5
dark 7.5
clothing 7.5
guy 7.5
photographer 7.5
accordion 7.5
shirt 7.5
laptop 7.4
suit 7.4
holding 7.4
glasses 7.4
lady 7.3
home 7.2
smile 7.1
monitor 7.1
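
Imagga serves tags like these from its /v2/tags endpoint using HTTP Basic auth. A minimal sketch, assuming that endpoint and response shape; the credentials and image URL are placeholders.

    import requests

    # Placeholder credentials; Imagga uses the API key/secret pair as
    # HTTP Basic auth.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("API_KEY", "API_SECRET"),
    ).json()

    # Tags come back as {"confidence": 32.5, "tag": {"en": "person"}},
    # matching the scores listed above.
    for item in response["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")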

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
person 96.7
black and white 89.3
drawing 60.1
clothing 55.8
street 52.4
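
Microsoft's tags correspond to Azure Computer Vision's image-tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are assumptions.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    # tag_image_in_stream returns tags with 0-1 confidences; the record
    # above appears to list them as percentages.
    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")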

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 88%
Happy 64.4%
Surprised 13.7%
Calm 9.5%
Angry 4.8%
Sad 2.9%
Disgusted 2.4%
Confused 1.7%
Fear 0.4%

AWS Rekognition

Age 31-41
Gender Female, 59.9%
Calm 98.9%
Sad 0.7%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Confused 0%
Angry 0%

AWS Rekognition

Age 37-45
Gender Female, 99.8%
Happy 95.7%
Calm 2.5%
Sad 0.7%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 97.7%
Calm 62.6%
Surprised 33.8%
Happy 1.7%
Sad 0.7%
Angry 0.4%
Disgusted 0.4%
Confused 0.4%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Female, 97.6%
Calm 61%
Surprised 8.9%
Happy 8.5%
Angry 7.9%
Fear 6%
Disgusted 4.2%
Sad 2.6%
Confused 0.9%

AWS Rekognition

Age 23-31
Gender Male, 96.3%
Surprised 59.6%
Happy 26.7%
Calm 11.6%
Angry 0.5%
Fear 0.4%
Confused 0.4%
Disgusted 0.4%
Sad 0.4%
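
Each block above matches one FaceDetails entry from Rekognition's DetectFaces API when called with all attributes. A sketch with boto3; the file name and region are assumptions.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=['ALL'] adds the age range, gender, and emotion
    # estimates shown in each block above.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")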

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
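
Google Cloud Vision reports face attributes as likelihood buckets rather than percentages, which is why this block reads "Very unlikely"/"Unlikely". A sketch with the google-cloud-vision client; the file name is an assumption.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # Each face annotation carries likelihood enums (VERY_UNLIKELY ..
    # VERY_LIKELY) for joy, sorrow, anger, surprise, headwear, and blur.
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)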

Feature analysis

Amazon

Person 99.2%
Shoe 60.1%
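
These feature-analysis entries appear to be the DetectLabels results that carry bounding-box Instances, i.e. localizable objects such as Person and Shoe, as opposed to scene-level labels. A sketch of filtering for them; the file name and region are assumptions.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()})

    # Labels with a non-empty Instances list are localized objects with
    # bounding boxes; labels like "Furniture" describe the scene only.
    for label in response["Labels"]:
        if label["Instances"]:
            print(f"{label['Name']} {label['Confidence']:.1f}%")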

Text analysis

Amazon

3
3 ٢٢
٢٢
MJ13
MJ13 YE33AS DOSHA
YE33AS
DOSHA
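
The strings above look like OCR of the photograph's edge markings. Rekognition's DetectText returns both LINE and WORD detections, which explains why a line such as "MJ13 YE33AS DOSHA" appears alongside its individual words. A sketch with boto3; the file name and region are assumptions.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Each detection is typed LINE or WORD; WORDs repeat the pieces of
    # their parent LINE, as in the list above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])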