Human Generated Data

Title

Untitled (model railroad on table in corner of room with four men watching train)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9138

Machine Generated Data

Tags (each label is followed by its confidence score, 0-100)

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.1
Person 98.9
Shelf 94.5
Furniture 94.2
Person 92.3
Bookcase 88
Clothing 85.6
Apparel 85.6
Home Decor 83.9
Meal 80.3
Food 80.3
Indoors 68.5
Person 65.8
Table 65.6
Beverage 61.4
Drink 61.4
Female 60.9
Room 59.5
Dish 59.4
Alcohol 59.3
Sleeve 58.9
Flooring 57.3
Glass 55.2
Bed 55.2

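A minimal sketch of how label/confidence pairs like the list above can be produced with Amazon Rekognition's DetectLabels API. That this record used DetectLabels is an assumption; the boto3 call itself is real, but the bucket and object names are placeholders, not the museum's storage.

import boto3

# Rekognition client; credentials come from the standard AWS configuration.
client = boto3.client("rekognition")

# Bucket and object names below are hypothetical placeholders.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MaxLabels=30,
    MinConfidence=50,
)

# Each label carries a name and a 0-100 confidence, matching rows such as "Person 99.6".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
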
Clarifai
created on 2023-10-26

people 99.7
monochrome 99.1
woman 97.8
adult 97.7
two 95.5
man 93.5
indoors 91.2
group 90.1
wedding 85
side view 83.8
couple 82.5
three 79.6
four 78.9
cooking 77.3
furniture 76
group together 75.5
sit 74.2
facial expression 73.7
meal 71.5
child 71.4

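A comparable sketch for the Clarifai tags, using Clarifai's public v2 REST "outputs" endpoint. Whether this record was built through REST or a client library is an assumption, and the key, model id, and image URL below are placeholders.

import requests

# Placeholder credential and a hypothetical image URL.
PAT = "YOUR_CLARIFAI_KEY"
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}

resp = requests.post(url, json=payload, headers={"Authorization": f"Key {PAT}"}, timeout=30)
resp.raise_for_status()

# Concept values are 0-1 probabilities; scaling by 100 gives rows like "people 99.7".
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
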
Imagga
created on 2022-01-23

man 38.3
people 32.9
person 31.8
couple 31.4
home 31.1
happy 30.7
male 29.1
adult 28.7
smiling 27.5
sitting 24.1
professional 19
men 18.9
lifestyle 18.8
smile 17.8
women 17.4
clinic 16.8
together 16.6
happiness 16.5
counter 16.2
senior 15.9
indoors 15.8
food 15.5
cheerful 15.4
30s 15.4
dinner 15
kitchen 14.8
20s 14.7
meal 14.6
indoor 14.6
two 14.4
portrait 14.2
family 14.2
attractive 14
mature 13.9
office 13.8
house 13.4
drink 13.4
clothing 13.3
pretty 13.3
table 13.2
holding 13.2
standing 13
room 12.9
business 12.8
casual 12.7
interior 12.4
restaurant 12.4
adults 12.3
eating 11.8
day 11.8
working 11.5
businessman 11.5
teacher 11.4
worker 10.8
team 10.8
colleagues 10.7
businesspeople 10.4
enjoying 10.4
mother 10.3
waiter 10.2
laptop 10
daughter 9.7
computer 9.6
meeting 9.4
clothes 9.4
lunch 9.3
service 9.3
wine 9.2
businesswoman 9.1
to 8.9
work 8.8
casual clothing 8.8
having 8.7
groom 8.7
educator 8.6
bright 8.6
life 8.4
horizontal 8.4
domestic 8.1
suit 8.1
group 8.1
handsome 8
grandma 8
looking 8
job 8
cooking 7.9
brunette 7.8
40s 7.8
color 7.8
staff 7.8
serving 7.7
mid adult 7.7
modern 7.7
white 7.7
elderly 7.7
husband 7.6
talking 7.6
wife 7.6
technology 7.4
shop 7.4
occupation 7.3
romantic 7.1
love 7.1

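The Imagga list can be reproduced with Imagga's v2 tagging endpoint. The sketch below assumes plain REST access; the key/secret pair and image URL are placeholders.

import requests

# Placeholder API key/secret pair (Imagga uses HTTP basic auth).
auth = ("YOUR_API_KEY", "YOUR_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical URL
    auth=auth,
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score, e.g. "man 38.3".
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
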
Google
created on 2022-01-23

(no tags listed in the source record)

Microsoft
created on 2022-01-23

person 99.8
text 98.8
indoor 95.9
man 93.2
clothing 90.5
standing 90.3
black and white 67.8
table 62.2
preparing 41.5
shop 9.7

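For the Microsoft tags, the likely source is Azure Computer Vision's image-analysis operation; that attribution is an assumption, and the resource endpoint, key, and image URL below are placeholders.

import requests

# Hypothetical Azure resource endpoint and key.
endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
key = "YOUR_KEY"

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.org/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Confidence is 0-1; scaled by 100 to match rows like "person 99.8".
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
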
Color Analysis

(shown as color swatches in the source record; not captured in this text extraction)

Face analysis

AWS Rekognition (first face)

Age 25-35
Gender Female, 99.6%
Calm 95%
Sad 2.3%
Surprised 0.7%
Confused 0.7%
Disgusted 0.5%
Happy 0.4%
Fear 0.2%
Angry 0.2%

AWS Rekognition (second face)

Age 33-41
Gender Male, 98.2%
Calm 95.8%
Sad 3.8%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0%
Fear 0%

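The age ranges, gender calls, and emotion percentages above map onto Rekognition's DetectFaces output. A minimal sketch, again with placeholder bucket and object names:

import boto3

client = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotions for each face.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive as TYPE/confidence pairs, e.g. CALM 95.0, SAD 2.3.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
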
Google Vision (five detected faces, each with identical likelihood ratings)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

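The likelihood ratings above correspond to Google Cloud Vision's face-detection annotations. A sketch assuming the google-cloud-vision client library and a placeholder image URL (application credentials must already be configured):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"  # hypothetical URL

response = client.face_detection(image=image)

# Likelihoods are enums (VERY_UNLIKELY ... VERY_LIKELY), matching "Joy Very unlikely".
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
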
Feature analysis

Amazon

Person 99.6%

Categories

Imagga

interior objects 99.6%

Text analysis

Amazon

YT3RAS
MJIR YT3RAS
MJIR
xt
xt Y
Y

Google

M YT3RA2 002MA
M
YT3RA2
002MA
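
The fragments above look like partial reads of film edge markings, so broken tokens are expected. Both services expose OCR calls that return exactly this kind of text output; a sketch of the two calls, with placeholder bucket, object, and URL:

import boto3
from google.cloud import vision

# Amazon Rekognition DetectText returns LINE and WORD detections with confidences.
rek = boto3.client("rekognition")
result = rek.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}}
)
for det in result["TextDetections"]:
    print(det["Type"], det["DetectedText"], f'{det["Confidence"]:.1f}')

# Google Cloud Vision text detection: the first annotation is the full detected
# block (e.g. "M YT3RA2 002MA"), the rest are individual tokens.
gv = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"
response = gv.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)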