Human Generated Data

Title

Untitled (woman and young girl posing with doll while boy in toy car watches inside Christmas living room)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9150

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Person 94.5
Person 79.7
Workshop 75
People 74.3
Furniture 71.8
Clinic 61.2
Backpack 60.9
Bag 60.9
Clothing 58.9
Apparel 58.9
Transportation 56.3

Clarifai
created on 2023-10-26

people 99.9
child 98.9
group 98.4
group together 97.9
wear 95.6
recreation 95.2
woman 94.9
adult 94.4
outfit 92.8
man 92.3
several 92.2
boy 90.9
veil 90.3
many 90.2
vehicle 87.4
furniture 85.4
three 84.9
actress 84.8
four 81.6
military 80

Imagga
created on 2022-01-23

barbershop 100
shop 100
mercantile establishment 82.4
place of business 54.9
salon 28
establishment 27.4
man 26.9
people 20.1
chair 18.1
male 17.7
adult 17.5
men 17.2
person 15
old 13.9
women 13.4
style 13.3
barber chair 12.2
home 12
working 11.5
indoors 11.4
black 11.4
room 11.1
hair 11.1
window 11
music 10.8
family 10.7
work 10.2
lifestyle 10.1
back 10.1
occupation 10.1
business 9.7
portrait 9.7
office 9.6
youth 9.4
seat 9.3
hairdresser 9.3
city 9.1
vintage 9.1
fashion 9
interior 8.8
job 8.8
scene 8.6
sitting 8.6
smile 8.5
professional 8.5
art 8.5
house 8.3
hand 8.3
happy 8.1
couple 7.8
life 7.8
model 7.8
casual 7.6
two 7.6
care 7.4
retro 7.4
entertainment 7.4
girls 7.3
sexy 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.2
black and white 92.2
clothing 90.9
street 88.9
person 86.9
footwear 60.9
monochrome 56.8
store 31.6
clothes 17.6

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Happy 87.4%
Confused 3.9%
Calm 3.4%
Sad 2.3%
Fear 1.2%
Disgusted 0.8%
Surprised 0.7%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (second detected face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Backpack 60.9%

Text analysis

Amazon

8
MJI7

Google

MJ17 YT3RA2 02MA
MJ17
YT3RA2
02MA