Human Generated Data

Title

Untitled (woman laughing while watching boy sitting in toy car inside Christmas living room)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9148

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Wheel 95.7
Machine 95.7
Person 95.5
Motorcycle 81.4
Transportation 81.4
Vehicle 81.4
Person 77.5
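
For context, labels of this kind are what AWS Rekognition's DetectLabels operation returns: object and scene names with confidence scores. The sketch below shows how such tags could be reproduced with boto3; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not the settings used for this record.

import boto3

# Placeholder path; the image for this record is not bundled here.
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns object/scene labels with confidence scores,
# comparable to the "Person 99.5" and "Wheel 95.7" entries above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # illustrative limit
    MinConfidence=75.0,  # illustrative threshold
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")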

Clarifai
created on 2023-10-26

people 100
vehicle 99.8
transportation system 99.1
group together 99
adult 98.9
two 98.6
group 98
three 97.2
man 96.4
several 95.3
car 95.3
driver 95.2
administration 93.9
four 93.1
one 92.6
wear 90.2
child 88.7
woman 86.8
recreation 85
many 84.7

Imagga
created on 2022-01-23

chair 41.3
wheelchair 41.2
seat 28.9
man 27.5
person 25.6
tricycle 25.6
wheeled vehicle 23.7
adult 22.7
people 22.3
vehicle 18.5
sitting 17.2
male 16.4
happy 16.3
furniture 16
outdoors 14.9
portrait 14.9
musical instrument 14.7
lifestyle 14.5
smiling 13.7
casual 13.6
fun 12.7
senior 12.2
smile 12.1
park 11.5
conveyance 11.5
day 11
care 10.7
medical 10.6
pretty 10.5
attractive 10.5
mature 10.2
teenager 10
rocking chair 10
outdoor 9.9
lady 9.7
health 9.7
work 9.6
elderly 9.6
men 9.4
room 9.2
holding 9.1
old 9.1
kin 9
one 9
working 8.8
home 8.8
retired 8.7
couple 8.7
women 8.7
bench 8.6
youth 8.5
business 8.5
relax 8.4
black 8.4
hand 8.4
city 8.3
fashion 8.3
furnishing 8.3
cheerful 8.1
sexy 8
computer 8
indoors 7.9
love 7.9
retirement 7.7
joy 7.5
leisure 7.5
teen 7.3
alone 7.3
school 7.3
aged 7.2
office 7.2
stringed instrument 7.2
looking 7.2
transportation 7.2
family 7.1
job 7.1
businessman 7.1
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.4
outdoor 91.8
black and white 87
wheel 77.2
person 54.9
land vehicle 53.4
vehicle 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 94.8%
Calm 97.6%
Sad 1.8%
Happy 0.3%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Fear 0%
Surprised 0%

AWS Rekognition

Age 29-39
Gender Male, 85.6%
Happy 81.7%
Calm 16%
Disgusted 0.5%
Sad 0.5%
Surprised 0.5%
Angry 0.3%
Fear 0.3%
Confused 0.3%
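
The age range, gender, and emotion percentages above are the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again with a placeholder file name:

import boto3

# Placeholder path; the image for this record is not bundled here.
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face,
# matching the two face records listed above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")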

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
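
The categorical ratings above (Very unlikely, Unlikely, and so on) are how the Google Cloud Vision face detection API reports joy, sorrow, anger, surprise, headwear, and blur, rather than as percentages. A minimal sketch with the google-cloud-vision client library; the file name is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder path; the image for this record is not bundled here.
with open("photograph.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each face annotation carries likelihood enums such as VERY_UNLIKELY,
# matching the ratings listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)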

Feature analysis

Amazon

Person 99.5%
Wheel 95.7%
Motorcycle 81.4%

Categories

Imagga

cars vehicles 92.8%
paintings art 4.8%

Text analysis

Amazon

LICW
WITH LICW
WITH
13150
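
The fragments above are the kind of output AWS Rekognition's DetectText operation produces: detected words and lines with confidence scores. A minimal sketch, with the file name again a placeholder:

import boto3

# Placeholder path; the image for this record is not bundled here.
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectText returns LINE and WORD detections; short fragments like
# "LICW", "WITH", and "13150" above are typical word-level results.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")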