Human Generated Data

Title

Untitled (young man posed feeding baby on couch next to woman and dog in foreground)

Date

1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9317

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Furniture 99.5
Chair 99.5
Person 98
Human 98
Dog 97.6
Mammal 97.6
Canine 97.6
Pet 97.6
Animal 97.6
Couch 96.4
Person 93.4
Clothing 91.1
Apparel 91.1
Sitting 83.7
Face 68.5
Portrait 62.6
Photography 62.6
Photo 62.6
Female 59.7
Screen 55.2
Electronics 55.2
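
A common first step when consuming tag lists like the one above is to filter by confidence score. The sketch below is purely illustrative: the (label, score) pairs are copied from the Amazon list for this record, and the 90.0 cutoff is an arbitrary choice, not a threshold used by the museum.

```python
# Minimal sketch: filter machine-generated tags by confidence.
# Pairs copied from the Amazon Rekognition list above (duplicates and
# synonym tags omitted); the 90.0 threshold is an illustrative choice.
amazon_tags = [
    ("Furniture", 99.5), ("Chair", 99.5), ("Person", 98.0), ("Human", 98.0),
    ("Dog", 97.6), ("Couch", 96.4), ("Clothing", 91.1), ("Sitting", 83.7),
    ("Face", 68.5), ("Portrait", 62.6), ("Female", 59.7), ("Screen", 55.2),
]

def confident_tags(tags, threshold=90.0):
    """Keep only labels at or above the given confidence score."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(amazon_tags))
# → ['Furniture', 'Chair', 'Person', 'Human', 'Dog', 'Couch', 'Clothing']
```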

Clarifai
created on 2023-10-26

people 99.8
monochrome 99.2
canine 99
two 98.9
dog 98.9
adult 98.3
group 97.5
man 97.4
group together 96.2
three 94.9
portrait 94.3
facial expression 94.2
four 93.9
woman 92.5
indoors 92.5
actor 91.2
humor 90.2
sit 89.7
sitting 88.1
wear 87.3

Imagga
created on 2022-01-23

negative 100
film 80.6
photographic paper 62.3
photographic equipment 41.5
senior 37.5
man 36.9
people 30.7
male 30.5
elderly 28.7
person 27
old 26.5
computer 25
adult 24.8
laptop 24.8
retirement 24
office 23.9
home 22.3
mature 21.4
work 20.4
portrait 20.1
business 20
sitting 19.7
happy 19.4
retired 18.4
looking 17.6
working 16.8
businessman 15.9
couple 15.7
professional 15.3
men 14.6
smiling 14.5
technology 14.1
indoors 14
lifestyle 13.7
screen 13.7
casual 13.5
smile 13.5
indoor 12.8
together 12.3
aged 11.8
older 11.6
worker 11.6
family 11.6
holding 11.5
job 11.5
husband 11.4
face 10.6
monitor 10.6
pensioner 10.2
occupation 10.1
communication 10.1
horizontal 10
active 9.9
one 9.7
room 9.7
age 9.5
women 9.5
camera 9.2
health 9
sculpture 9
70s 8.8
medical 8.8
table 8.8
notebook 8.8
grandfather 8.7
education 8.7
wife 8.5
meeting 8.5
relaxed 8.4
modern 8.4
leisure 8.3
teacher 8.2
lady 8.1
gray 8.1
success 8
handsome 8
hair 7.9
love 7.9
citizen 7.9
happiness 7.8
grandmother 7.8
60s 7.8
desk 7.8
career 7.6
house 7.5
teamwork 7.4
alone 7.3
group 7.2
team 7.2
statue 7.1
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.2
person 89.7
black and white 73.6

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Happy 97.2%
Surprised 1.2%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%
Sad 0.2%
Calm 0.2%
Confused 0.1%
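
A typical way to read the emotion scores above is to take the highest-confidence label as the dominant emotion. This is a minimal sketch with the scores copied from the AWS Rekognition block for this record.

```python
# Minimal sketch: pick the dominant emotion from the AWS Rekognition
# face-analysis scores listed above for this record.
emotions = {
    "Happy": 97.2, "Surprised": 1.2, "Disgusted": 0.4, "Angry": 0.3,
    "Fear": 0.2, "Sad": 0.2, "Calm": 0.2, "Confused": 0.1,
}

dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # → Happy 97.2
```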

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Dog 97.6%
