Human Generated Data

Title

Untitled (woman with baby on lap in car seat)

Date

1947, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.236

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2021-12-14

Person 98.4
Human 98.4
Person 83.9
Furniture 78.3
Shoe 74.4
Clothing 74.4
Footwear 74.4
Apparel 74.4
Car 72.9
Automobile 72.9
Transportation 72.9
Vehicle 72.9
Amusement Park 56.2
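
These labels have the shape of output from Amazon Rekognition's DetectLabels API. As a minimal sketch only (the record does not document the actual pipeline), a boto3 call like the following would yield a similar list; the AWS credentials and the local file name photo.jpg are assumptions:

import boto3

# Sketch: label detection with Amazon Rekognition (boto3).
# Assumes configured AWS credentials and a local copy of the image.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # the tags above bottom out around 56
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")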

Clarifai
created on 2023-10-25

people 100
two 99.3
child 99.1
monochrome 99.1
adult 99
vehicle 98.7
man 97.4
one 97.1
portrait 96.8
three 96.5
woman 96.1
baby 96.1
group 95.5
offspring 95.4
hospital 94.6
transportation system 92.1
family 91.1
son 90.9
chair 90.7
furniture 89.5
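
Concept/confidence pairs like those above are what Clarifai's general image-recognition model returns over its REST API. A hedged sketch follows; the endpoint path and model name reflect Clarifai's public v2 API, and the API key and image URL are placeholders:

import requests

# Sketch: Clarifai v2 "outputs" call against the public general model.
# API key and image URL are placeholders; the model that actually
# generated the tags above is not documented in this record.
API_KEY = "YOUR_CLARIFAI_API_KEY"
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")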

Imagga
created on 2021-12-14

vehicle 27.1
interior 19.4
car 18.5
motor vehicle 18.3
forklift 18
transport 17.3
luxury 17.1
modern 16.8
wheeled vehicle 16.6
device 16.2
transportation 16.1
inside 15.6
iron lung 15
indoors 14.9
equipment 14.8
window 14.6
home appliance 14.6
room 14.5
work 14.3
truck 14.1
architecture 14
business 13.4
design 12.9
home 12.8
apartment 12.4
people 12.3
respirator 11.9
van 11.9
appliance 11.7
house 11.7
furniture 11.7
city 11.6
man 11.4
industry 11.1
3d 10.8
seat 10.7
new 10.5
urban 10.5
chair 10.3
hospital 10.2
white goods 10.1
breathing device 9.9
metal 9.6
building 9.6
light 9.4
kitchen appliance 9.3
glass 9.3
person 9.2
wood 9.2
toaster 9.1
structure 9
machine 9
refrigerator 8.9
computer 8.9
golf equipment 8.7
automobile 8.6
residential 8.6
auto 8.6
empty 8.6
floor 8.4
fashion 8.3
door 8
decor 8
working 7.9
lamp 7.8
driver 7.8
travel 7.7
sitting 7.7
wall 7.7
drive 7.6
leisure 7.5
technology 7.4
office 7.4
table 7.3
moving van 7.2
adult 7.1
job 7.1
steel 7.1
model t 7.1
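
Imagga serves tags like these from its v2 tagging endpoint, which authenticates with HTTP Basic credentials. A sketch under those assumptions (key, secret, and image URL are placeholders):

import requests

# Sketch: Imagga v2 tagging endpoint with HTTP Basic auth.
# Key, secret, and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")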

Microsoft
created on 2021-12-14

text 98.5
person 71.9
black and white 67.7
land vehicle 64.1
vehicle 54.6
posing 42.1
old 41.9
vintage 27.6
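
Tags in this form match Azure Computer Vision's image-tagging operation. A sketch against the v3.2 REST endpoint; the resource host, key, and image URL are placeholders, and the API version is an assumption:

import requests

# Sketch: Azure Computer Vision v3.2 /tag operation.
# Endpoint host, subscription key, and image URL are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}
body = {"url": "https://example.com/photo.jpg"}

resp = requests.post(f"{endpoint}/vision/v3.2/tag", headers=headers, json=body)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")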

Color Analysis

Face analysis

AWS Rekognition (face 1)

Age 52-70
Gender Female, 75.4%
Calm 81%
Confused 9.1%
Surprised 2.4%
Angry 2.3%
Happy 2.1%
Sad 1.5%
Fear 1%
Disgusted 0.6%

AWS Rekognition (face 2)

Age 0-3
Gender Female, 82.4%
Calm 96%
Sad 1.1%
Angry 0.7%
Fear 0.7%
Surprised 0.6%
Confused 0.4%
Disgusted 0.3%
Happy 0.2%
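
The two blocks above are per-face results of the kind Amazon Rekognition's DetectFaces returns with full attributes, presumably one face for the woman and one for the baby. A minimal boto3 sketch, again assuming AWS credentials and a local photo.jpg:

import boto3

# Sketch: face analysis with Amazon Rekognition; one FaceDetails
# entry per detected face, each with age range, gender, and emotions.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")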

Microsoft Cognitive Services

Age 49
Gender Female

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
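
The Google Vision rows map directly onto the likelihood fields of its face-detection response. A sketch using the google-cloud-vision client, assuming application-default credentials and a local photo.jpg:

from google.cloud import vision

# Sketch: Google Cloud Vision face detection; each annotation carries
# likelihood enums matching the rows above (VERY_UNLIKELY ... VERY_LIKELY).
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)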

Feature analysis

Amazon

Person 98.4%
Shoe 74.4%
Car 72.9%
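
These rows likely correspond to Rekognition labels that carry localized instances: DetectLabels attaches an Instances list with bounding boxes to labels such as Person, Shoe, and Car. A sketch that surfaces them, under the same credential and file-name assumptions as above:

import boto3

# Sketch: print only labels that have per-instance bounding boxes.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # fractional Width/Height/Left/Top
        print(f"{label['Name']} {instance['Confidence']:.1f}% at {box}")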

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

Elder's
DA

Google

Sliders
Sliders
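
The text hits plausibly come from the two services' OCR operations: Amazon Rekognition's DetectText ("Elder's", "DA") and Google Vision's text detection ("Sliders"). Google returns the full text block first and then each detected word, which would explain the repeated "Sliders". A combined sketch, assuming credentials for both services and a local photo.jpg:

import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Sketch: Amazon Rekognition text detection.
rek = boto3.client("rekognition")
for det in rek.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print("Amazon:", det["DetectedText"], f"{det['Confidence']:.1f}%")

# Sketch: Google Cloud Vision text detection; the first annotation is
# the full text, followed by one annotation per word.
gcv = vision.ImageAnnotatorClient()
response = gcv.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)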