Human Generated Data

Title

Untitled (four women looking at gift in front of large table of many gifts)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9379
Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.1
Person 98.9
Person 97.5
Person 96.7
Person 90.5
Helmet 90.4
Clothing 90.4
Apparel 90.4
Room 85
Indoors 85
Furniture 74.3
Workshop 65.5
Monitor 64.3
Electronics 64.3
Display 64.3
Screen 64.3
Living Room 63.3
People 63
Bedroom 57.7

Clarifai
created on 2023-10-27

people 99.8
group together 98.7
group 98.4
adult 97.9
man 95.5
three 94.1
vehicle 94.1
woman 93.6
administration 92.4
war 91.9
furniture 91.9
several 91.2
military 91.1
two 91
many 89.6
monochrome 88.4
music 85.9
child 85.4
wear 83.9
five 81.3

Imagga
created on 2022-01-23

chair 30.8
salon 30.5
person 29.4
people 28.4
man 27.5
room 25.6
male 24.8
table 23.7
indoors 22
adult 20.1
office 19.6
musical instrument 18.9
interior 18.6
women 18.2
business 17.6
home 17.5
smiling 16.6
sitting 16.3
men 16.3
shop 16.3
lifestyle 15.9
classroom 15.4
group 15.3
teacher 15.1
businessman 15
work 15
modern 14.7
happy 14.4
barbershop 13.9
barber chair 13.5
furniture 13.3
happiness 13.3
indoor 12.8
job 12.4
cheerful 12.2
smile 12.1
two 11.9
desk 11.7
house 11.7
portrait 11.6
seat 11.3
computer 11.3
mature 11.2
stringed instrument 11.1
casual 11
class 10.6
working 10.6
education 10.4
meeting 10.4
mercantile establishment 10.1
communication 10.1
team 9.9
kitchen 9.8
professional 9.8
together 9.6
wind instrument 9.6
couple 9.6
living 9.5
study 9.3
laptop 9.1
businesswoman 9.1
holding 9.1
board 9
bowed stringed instrument 9
life 9
discussion 8.8
school 8.8
boy 8.7
outfit 8.7
corporate 8.6
executive 8.5
enjoyment 8.4
elegance 8.4
leisure 8.3
worker 8.3
looking 8
glass 7.8
violin 7.7
pretty 7.7
exam 7.7
youth 7.7
comfortable 7.6
enjoying 7.6
adults 7.6
friends 7.5
fun 7.5
manager 7.4
floor 7.4
stage 7.4
coffee 7.4
inside 7.4
window 7.3
confident 7.3
student 7.2
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.9
black and white 91.1
person 86.4
clothing 54.7
old 42.9

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 97.8%
Happy 30.7%
Sad 30.1%
Calm 20.2%
Confused 10.2%
Surprised 3.2%
Disgusted 2.6%
Angry 2.3%
Fear 0.7%

AWS Rekognition

Age 37-45
Gender Female, 98.6%
Calm 95.2%
Sad 2.4%
Happy 1%
Confused 0.5%
Disgusted 0.4%
Surprised 0.3%
Angry 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.5%
Helmet 90.4%

Text analysis

Amazon

36
KODVK-SVEELA
10201

Google

MAGOX
MAGOX